Oct 02 11:09:58 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 11:09:58 crc restorecon[4729]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: 
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:58 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 
11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 11:09:59 crc 
restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 11:09:59 crc restorecon[4729]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc 
restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 11:09:59 crc restorecon[4729]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Oct 02 11:09:59 crc kubenswrapper[4929]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 02 11:09:59 crc kubenswrapper[4929]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Oct 02 11:09:59 crc kubenswrapper[4929]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 02 11:09:59 crc kubenswrapper[4929]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 02 11:09:59 crc kubenswrapper[4929]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 02 11:09:59 crc kubenswrapper[4929]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.871049 4929 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880252 4929 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880711 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880721 4929 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880731 4929 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880740 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880750 4929 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880759 4929 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880772 4929 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880783 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880792 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880801 4929 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880809 4929 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880817 4929 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880825 4929 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880833 4929 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880842 4929 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880850 4929 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880857 4929 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880865 4929 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880873 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880880 4929 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880888 4929 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880903 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880914 4929 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880923 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880931 4929 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880939 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880947 4929 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880982 4929 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880990 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.880998 4929 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881007 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881014 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881022 4929 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881044 4929 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881055 4929 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881245 4929 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881255 4929 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881265 4929 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881275 4929 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881285 4929 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881295 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881305 4929 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881315 4929 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881324 4929 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881333 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881343 4929 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881352 4929 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881362 4929 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881372 4929 feature_gate.go:330]
unrecognized feature gate: NewOLM Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881380 4929 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881389 4929 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881401 4929 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881410 4929 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881420 4929 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881429 4929 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881441 4929 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881455 4929 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881466 4929 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881477 4929 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881489 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881499 4929 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881509 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881519 4929 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881528 4929 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881537 4929 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881548 4929 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881556 4929 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881565 4929 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881572 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.881580 4929 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881769 4929 flags.go:64] FLAG: --address="0.0.0.0" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881789 4929 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881825 4929 flags.go:64] FLAG: --anonymous-auth="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881837 4929 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881849 4929 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 
11:09:59.881860 4929 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881872 4929 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881884 4929 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881894 4929 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881903 4929 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881913 4929 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881922 4929 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881932 4929 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881942 4929 flags.go:64] FLAG: --cgroup-root="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881950 4929 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881990 4929 flags.go:64] FLAG: --client-ca-file="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.881999 4929 flags.go:64] FLAG: --cloud-config="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882008 4929 flags.go:64] FLAG: --cloud-provider="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882018 4929 flags.go:64] FLAG: --cluster-dns="[]" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882202 4929 flags.go:64] FLAG: --cluster-domain="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882210 4929 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882221 4929 flags.go:64] FLAG: --config-dir="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882229 4929 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882239 4929 flags.go:64] FLAG: --container-log-max-files="5" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882250 4929 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882260 4929 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882269 4929 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882278 4929 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882288 4929 flags.go:64] FLAG: --contention-profiling="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882297 4929 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882306 4929 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882316 4929 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882325 4929 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882337 4929 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882346 4929 flags.go:64] FLAG: 
--enable-controller-attach-detach="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882355 4929 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882365 4929 flags.go:64] FLAG: --enable-load-reader="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882375 4929 flags.go:64] FLAG: --enable-server="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882384 4929 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882395 4929 flags.go:64] FLAG: --event-burst="100" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882405 4929 flags.go:64] FLAG: --event-qps="50" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882414 4929 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882423 4929 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882432 4929 flags.go:64] FLAG: --eviction-hard="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882443 4929 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882452 4929 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882461 4929 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882470 4929 flags.go:64] FLAG: --eviction-soft="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882479 4929 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882488 4929 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882497 4929 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882506 4929 flags.go:64] FLAG: --experimental-mounter-path="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882515 4929 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882526 4929 flags.go:64] FLAG: --fail-swap-on="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882535 4929 flags.go:64] FLAG: --feature-gates="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882546 4929 flags.go:64] FLAG: --file-check-frequency="20s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882557 4929 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882566 4929 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882576 4929 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882585 4929 flags.go:64] FLAG: --healthz-port="10248" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882595 4929 flags.go:64] FLAG: --help="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882604 4929 flags.go:64] FLAG: --hostname-override="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882612 4929 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882622 4929 flags.go:64] FLAG: --http-check-frequency="20s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882631 4929 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 02 
11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882641 4929 flags.go:64] FLAG: --image-credential-provider-config="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882649 4929 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882658 4929 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882667 4929 flags.go:64] FLAG: --image-service-endpoint="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882676 4929 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882685 4929 flags.go:64] FLAG: --kube-api-burst="100" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882694 4929 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882704 4929 flags.go:64] FLAG: --kube-api-qps="50" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882716 4929 flags.go:64] FLAG: --kube-reserved="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882725 4929 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882734 4929 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882743 4929 flags.go:64] FLAG: --kubelet-cgroups="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882752 4929 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882762 4929 flags.go:64] FLAG: --lock-file="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882771 4929 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882782 4929 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882791 4929 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882805 4929 flags.go:64] FLAG: --log-json-split-stream="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882815 4929 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882824 4929 flags.go:64] FLAG: --log-text-split-stream="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882835 4929 flags.go:64] FLAG: --logging-format="text" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882844 4929 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882854 4929 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882863 4929 flags.go:64] FLAG: --manifest-url="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882871 4929 flags.go:64] FLAG: --manifest-url-header="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882883 4929 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882892 4929 flags.go:64] FLAG: --max-open-files="1000000" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882908 4929 flags.go:64] FLAG: --max-pods="110" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882917 4929 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882926 4929 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 02 11:09:59 
crc kubenswrapper[4929]: I1002 11:09:59.882935 4929 flags.go:64] FLAG: --memory-manager-policy="None" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882945 4929 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882980 4929 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882989 4929 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.882999 4929 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883019 4929 flags.go:64] FLAG: --node-status-max-images="50" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883028 4929 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883039 4929 flags.go:64] FLAG: --oom-score-adj="-999" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883048 4929 flags.go:64] FLAG: --pod-cidr="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883057 4929 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883072 4929 flags.go:64] FLAG: --pod-manifest-path="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883081 4929 flags.go:64] FLAG: --pod-max-pids="-1" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883091 4929 flags.go:64] FLAG: --pods-per-core="0" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883100 4929 flags.go:64] FLAG: --port="10250" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883110 4929 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883119 4929 flags.go:64] FLAG: --provider-id="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883128 4929 flags.go:64] FLAG: --qos-reserved="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883137 4929 flags.go:64] FLAG: --read-only-port="10255" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883147 4929 flags.go:64] FLAG: --register-node="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883156 4929 flags.go:64] FLAG: --register-schedulable="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883164 4929 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883195 4929 flags.go:64] FLAG: --registry-burst="10" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883205 4929 flags.go:64] FLAG: --registry-qps="5" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883214 4929 flags.go:64] FLAG: --reserved-cpus="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883223 4929 flags.go:64] FLAG: --reserved-memory="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883233 4929 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883243 4929 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883252 4929 flags.go:64] FLAG: --rotate-certificates="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883262 4929 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 02 11:09:59 crc 
kubenswrapper[4929]: I1002 11:09:59.883270 4929 flags.go:64] FLAG: --runonce="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883279 4929 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883288 4929 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883297 4929 flags.go:64] FLAG: --seccomp-default="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883306 4929 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883315 4929 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883324 4929 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883333 4929 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883343 4929 flags.go:64] FLAG: --storage-driver-password="root" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883376 4929 flags.go:64] FLAG: --storage-driver-secure="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883388 4929 flags.go:64] FLAG: --storage-driver-table="stats" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883398 4929 flags.go:64] FLAG: --storage-driver-user="root" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883407 4929 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883417 4929 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883426 4929 flags.go:64] FLAG: --system-cgroups="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883434 4929 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883449 4929 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883458 4929 flags.go:64] FLAG: --tls-cert-file="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883468 4929 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883481 4929 flags.go:64] FLAG: --tls-min-version="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883490 4929 flags.go:64] FLAG: --tls-private-key-file="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883499 4929 flags.go:64] FLAG: --topology-manager-policy="none" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883508 4929 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883517 4929 flags.go:64] FLAG: --topology-manager-scope="container" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883526 4929 flags.go:64] FLAG: --v="2" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883537 4929 flags.go:64] FLAG: --version="false" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883548 4929 flags.go:64] FLAG: --vmodule="" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883559 4929 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.883568 4929 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883790 4929 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 
11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883801 4929 feature_gate.go:330] unrecognized feature gate: Example Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883810 4929 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883818 4929 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883829 4929 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883840 4929 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883850 4929 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883859 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883867 4929 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883876 4929 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883884 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883892 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883900 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883908 4929 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883917 4929 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883928 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883938 4929 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883947 4929 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883955 4929 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883989 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.883997 4929 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884006 4929 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884014 4929 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884022 4929 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884031 4929 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884039 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 
11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884047 4929 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884054 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884063 4929 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884071 4929 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884079 4929 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884089 4929 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884099 4929 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884108 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884118 4929 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884126 4929 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884135 4929 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884142 4929 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884151 4929 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884159 4929 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884166 4929 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884177 4929 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884185 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884193 4929 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884201 4929 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884210 4929 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884218 4929 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884226 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884235 4929 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884242 4929 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884250 4929 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 11:09:59 crc kubenswrapper[4929]: 
W1002 11:09:59.884258 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884266 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884274 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884282 4929 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884289 4929 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884297 4929 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884305 4929 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884313 4929 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884321 4929 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884329 4929 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884336 4929 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884344 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884352 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884361 4929 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884369 4929 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884379 4929 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
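
The long flags.go:64 block above is the kubelet logging every parsed command-line flag, defaults included, exactly once at startup. A stdlib sketch of the same pattern — assuming Go's standard flag package in place of the pflag-based component-base helper the kubelet actually uses:

// printflags_sketch.go — sketch of the "FLAG: --name=value" dump above.
package main

import (
	"flag"
	"log"
)

func main() {
	v := flag.Int("v", 2, "log verbosity")
	nodeIP := flag.String("node-ip", "", "node IP")
	flag.Parse()

	// VisitAll walks every registered flag, set or not, so the startup log
	// records the effective value of each one (defaults included).
	flag.VisitAll(func(f *flag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value)
	})
	_ = *v
	_ = *nodeIP
}

Dumping effective flag values at verbosity 2 makes the startup log self-describing: the FLAG block above is the record of how this kubelet was invoked.
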
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884389 4929 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884398 4929 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884407 4929 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.884415 4929 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.884439 4929 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.898724 4929 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.898784 4929 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898875 4929 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898886 4929 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898891 4929 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898895 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898911 4929 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898916 4929 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898921 4929 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898927 4929 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898931 4929 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898936 4929 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898940 4929 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898945 4929 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898949 4929 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898953 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898980 4929 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898985 4929 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.898989 4929 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899010 4929 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899016 4929 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899023 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899027 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899031 4929 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899035 4929 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899040 4929 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899060 4929 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899066 4929 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899071 4929 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899076 4929 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899082 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899089 4929 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899095 4929 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899099 4929 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899103 4929 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899108 4929 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899112 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899116 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899134 4929 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899140 4929 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899144 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899148 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899152 4929 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899157 4929 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899160 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899165 4929 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899169 4929 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899173 4929 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899177 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899184 4929 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899188 4929 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899192 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899209 4929 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899214 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899219 4929 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899222 4929 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899226 4929 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899231 4929 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899235 4929 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899238 4929 feature_gate.go:330] unrecognized feature gate: Example Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899242 4929 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899246 4929 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899250 4929 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899253 4929 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899257 4929 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899261 4929 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899266 4929 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899271 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899291 4929 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899295 4929 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899300 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899304 4929 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899309 4929 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.899318 4929 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899534 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899544 4929 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899562 4929 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899567 4929 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899571 4929 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899575 4929 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899578 4929 
feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899582 4929 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899586 4929 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899590 4929 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899594 4929 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899597 4929 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899601 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899605 4929 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899609 4929 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899613 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899617 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899621 4929 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899641 4929 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899646 4929 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899651 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899656 4929 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899660 4929 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899665 4929 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899670 4929 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
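
Each pass over the configured gates (the warning run continues just below) ends with an info line of the form "feature gates: {map[...]}" — three appear in this log: the recognized overrides merged over the registry's defaults, with unknown names dropped after being warned about. A minimal sketch of that merge, with made-up defaults:

// resolve_sketch.go — sketch of the "feature gates: {map[...]}" lines:
// recognized overrides layered over built-in defaults, unknown names dropped.
package main

import (
	"fmt"
	"sort"
)

func resolve(defaults, overrides map[string]bool) map[string]bool {
	out := make(map[string]bool, len(defaults))
	for k, v := range defaults {
		out[k] = v
	}
	for k, v := range overrides {
		if _, known := defaults[k]; known {
			out[k] = v // e.g. KMSv1=true in this log
		}
		// unknown gates (GatewayAPI, NewOLM, ...) were already warned about
	}
	return out
}

func main() {
	defaults := map[string]bool{"KMSv1": false, "NodeSwap": false}
	got := resolve(defaults, map[string]bool{"KMSv1": true, "GatewayAPI": true})
	keys := make([]string, 0, len(got))
	for k := range got {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	for _, k := range keys {
		fmt.Printf("%s:%v ", k, got[k])
	}
	fmt.Println()
}

The identical map appearing after each pass is the expected outcome: the same override set resolved against the same defaults.
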
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899675 4929 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899679 4929 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899683 4929 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899687 4929 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899691 4929 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899695 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899699 4929 feature_gate.go:330] unrecognized feature gate: Example
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899717 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899721 4929 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899725 4929 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899729 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899733 4929 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899738 4929 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899742 4929 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899745 4929 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899749 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899752 4929 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899756 4929 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899760 4929 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899764 4929 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899770 4929 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899774 4929 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899778 4929 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899796 4929 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899801 4929 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899805 4929 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899808 4929 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899813 4929 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899817 4929 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899821 4929 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899825 4929 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899830 4929 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899834 4929 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899838 4929 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899842 4929 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899846 4929 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899850 4929 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899853 4929 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899872 4929 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899876 4929 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899880 4929 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899884 4929 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899888 4929 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899892 4929 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899896 4929 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 02 11:09:59 crc kubenswrapper[4929]: W1002 11:09:59.899900 4929 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.899906 4929 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.901293 4929 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.906433 4929 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.906517 4929 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.908200 4929 server.go:997] "Starting client certificate rotation"
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.908239 4929 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.908447 4929 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-06 01:43:19.65268197 +0000 UTC
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.908579 4929 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2294h33m19.744106018s for next certificate rotation
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.934532 4929 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.936591 4929 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 11:09:59 crc kubenswrapper[4929]: I1002 11:09:59.959936 4929 log.go:25] "Validated CRI v1 runtime API"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.002705 4929 log.go:25] "Validated CRI v1 image API"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.004918 4929 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.011937 4929 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-11-05-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.012039 4929 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.042494 4929 manager.go:217] Machine: {Timestamp:2025-10-02 11:10:00.040284478 +0000 UTC m=+0.590650862 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0ee67423-5105-4391-ab46-c42062aff8c4 BootID:4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:31:a9:eb Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:31:a9:eb Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c0:cd:01 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6a:2b:f8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a4:cc:65 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:76:bf:b9 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:f1:8f:7c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c2:16:da:8e:20:7d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:06:f8:7f:89:76 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.042881 4929 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.043135 4929 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.044285 4929 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.044497 4929 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.044544 4929 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.044949 4929 topology_manager.go:138] "Creating topology manager with none policy"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.044974 4929 container_manager_linux.go:303] "Creating device plugin manager"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.045630 4929 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.045662 4929 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.046021 4929 state_mem.go:36] "Initialized new in-memory state store"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.046110 4929 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.050142 4929 kubelet.go:418] "Attempting to sync node with API server"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.050165 4929 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.050190 4929 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.050205 4929 kubelet.go:324] "Adding apiserver pod source"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.050218 4929 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.055164 4929 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.056867 4929 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.058232 4929 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.058294 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.058417 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.058547 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.058562 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060290 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060388 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060457 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060515 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060590 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060641 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060708 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060767 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060817 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060865 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060916 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.060994 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.062405 4929 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.063027 4929 server.go:1280] "Started kubelet"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.067691 4929 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.069250 4929 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.068480 4929 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.070637 4929 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 02 11:10:00 crc systemd[1]: Started Kubernetes Kubelet.
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.083638 4929 server.go:460] "Adding debug handlers to kubelet server"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.084348 4929 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.084415 4929 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.085032 4929 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.085040 4929 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:29:15.350052006 +0000 UTC
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.085091 4929 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1377h19m15.264963317s for next certificate rotation
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.085159 4929 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.085192 4929 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.085250 4929 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.084180 4929 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aa812155e9174 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 11:10:00.062996852 +0000 UTC m=+0.613363216,LastTimestamp:2025-10-02 11:10:00.062996852 +0000 UTC m=+0.613363216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.086146 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.086245 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.086814 4929 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="200ms"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.087027 4929 factory.go:55] Registering systemd factory
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.087061 4929 factory.go:221] Registration of the systemd container factory successfully
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.087500 4929 factory.go:153] Registering CRI-O factory
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.087539 4929 factory.go:221] Registration of the crio container factory successfully
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.087698 4929 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.087749 4929 factory.go:103] Registering Raw factory
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.087787 4929 manager.go:1196] Started watching for new ooms in manager
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.089151 4929 manager.go:319] Starting recovery of all containers
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099608 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099746 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099776 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099803 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099829 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099855 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099881 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099907 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.099938 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100033 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100071 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100102 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100130 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100163 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100189 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100221 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100259 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100287 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100316 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100350 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100377 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100409 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100437 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100468 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100498 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100529 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100565 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100597 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100632 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100658 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100685 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100711 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100739 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100767 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100797 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100826 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100852 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100877 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100906 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.100938 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101003 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101036 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101072 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101100 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101127 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101155 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101187 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101217 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101248 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101278 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101306 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101336 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101376 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101407 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101437 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101473 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101503 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101533 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101564 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101592 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101620 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101650 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101677 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101706 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101736 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101762 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101790 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101817 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101846 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101875 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101907 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101938 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.101999 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102033 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102062 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102091 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102121 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102149 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102175 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102202 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102238 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102276 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102302 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102333 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102360 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102389 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102420 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102493 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102520 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102551 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102579 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102608 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102635 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102663 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102693 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102720 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102746 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102771 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102800 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102862 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102892 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102919 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.102946 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103039 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103081 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103111 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103144 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103175 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103209 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103239 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103268 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103298 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103327 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103356 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103384 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103414 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103440 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103494 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103521 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103552 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103580 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103606 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103633 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103661 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103688 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103717 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103746 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103773 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103801 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103829 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103862 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103888 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.103929 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104096 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104137 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104171 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104204 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104232 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104263 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104291 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104321 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803"
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104348 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104374 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104402 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104431 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104459 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104486 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104517 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104546 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104574 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104601 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104633 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104660 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104688 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104718 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104746 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104773 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104802 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104830 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104857 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104906 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104934 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.104998 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105029 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105068 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105097 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105123 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105153 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105181 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105208 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105235 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.105261 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110398 4929 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110497 4929 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110540 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110574 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110644 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110680 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110724 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110756 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110787 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110827 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110857 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110900 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.110930 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111001 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111045 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111078 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111190 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111250 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111286 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111335 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111370 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111411 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111489 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111535 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111583 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111623 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111656 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111706 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111738 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111784 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111821 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111860 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111905 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.111946 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.112058 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.112096 4929 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.112123 4929 reconstruct.go:97] "Volume reconstruction finished" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.112158 4929 reconciler.go:26] "Reconciler: start to sync state" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.131138 4929 manager.go:324] Recovery completed Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.141432 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.142834 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.142884 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.142900 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.143900 4929 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.143919 4929 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.143938 4929 state_mem.go:36] "Initialized new in-memory state store" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.150691 4929 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.154680 4929 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.155289 4929 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.155363 4929 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.155457 4929 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.159825 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.159941 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.170449 4929 policy_none.go:49] "None policy: Start" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.171450 4929 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.171472 4929 state_mem.go:35] "Initializing new in-memory state store" Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.185158 4929 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.255064 4929 manager.go:334] "Starting Device Plugin manager" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.255120 4929 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.255134 4929 server.go:79] "Starting device plugin registration server" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.255587 4929 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.255604 4929 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.255630 4929 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.255839 4929 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.256082 4929 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.256096 4929 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.270120 4929 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.288074 4929 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="400ms" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.356313 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.358047 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.358115 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.358135 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.358194 4929 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.358891 4929 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.456069 4929 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.456256 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.458265 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.458322 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.458352 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.458671 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.459833 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.459922 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.461276 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.461324 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.461291 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.461369 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.461400 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.461425 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.461754 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.462192 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.462225 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.464320 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.464343 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.464352 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.465521 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.465573 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.465593 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.466462 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.466527 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.476670 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.477643 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.477685 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.477696 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.477868 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.477911 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.477933 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.478241 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.478370 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.478412 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.482454 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.482502 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.482519 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.483056 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.483093 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.483111 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.483492 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.483546 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.485845 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.485895 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.485908 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517144 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517196 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517216 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517234 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517255 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517274 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517292 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517309 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517326 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517343 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517396 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517475 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517498 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517577 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.517639 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.560036 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.562153 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.562212 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 
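All of the volumes verified here, and mounted just below, are kubernetes.io/host-path volumes belonging to the five static pods, so attaching and mounting them needs no API round-trip and succeeds even while every API-bound call is still being refused. For orientation, an equivalent volume stanza expressed with the k8s.io/api Go types; the hostPath path is an assumed example for illustration, not taken from the actual manifest:

package main

import (
	"encoding/json"
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// DirectoryOrCreate makes the kubelet create the host directory if
	// it does not already exist; a plain Directory type would fail instead.
	hostPathType := corev1.HostPathDirectoryOrCreate
	vol := corev1.Volume{
		Name: "cert-dir", // matches the volume name in the log lines above
		VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{
				Path: "/etc/kubernetes/static-pod-resources/etcd-certs", // assumed path
				Type: &hostPathType,
			},
		},
	}
	b, err := json.MarshalIndent(vol, "", "  ")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(b))
}

Because hostPath SetUp is essentially a no-op on the node, the MountVolume.SetUp lines that follow complete within the same millisecond as the mount requests.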
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.562299 4929 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.563161 4929 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619396 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619440 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619470 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619495 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619518 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619540 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619565 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619596 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619628 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619659 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619717 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619750 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619776 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619801 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.619825 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620228 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620245 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620332 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620358 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620385 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620412 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620442 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620449 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620487 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620519 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620517 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620559 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620467 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620639 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.620329 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.689509 4929 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="800ms"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.808936 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.829868 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.842617 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.855343 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-57271b7f720bbb4b2fdf75c44245c15625834e875072727cd47fb951746579a5 WatchSource:0}: Error finding container 57271b7f720bbb4b2fdf75c44245c15625834e875072727cd47fb951746579a5: Status 404 returned error can't find the container with id 57271b7f720bbb4b2fdf75c44245c15625834e875072727cd47fb951746579a5
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.868825 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.873696 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-3ebd0f258ad9ebcfea45f0ddad3e5e7064c9d3ba60ba0e920d7cf23bf1bb1c81 WatchSource:0}: Error finding container 3ebd0f258ad9ebcfea45f0ddad3e5e7064c9d3ba60ba0e920d7cf23bf1bb1c81: Status 404 returned error can't find the container with id 3ebd0f258ad9ebcfea45f0ddad3e5e7064c9d3ba60ba0e920d7cf23bf1bb1c81
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.880259 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.894219 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e0b637a335c19657d4d46fd86d8f4b832a3f18bcfd9bb64a3a31d843746321e0 WatchSource:0}: Error finding container e0b637a335c19657d4d46fd86d8f4b832a3f18bcfd9bb64a3a31d843746321e0: Status 404 returned error can't find the container with id e0b637a335c19657d4d46fd86d8f4b832a3f18bcfd9bb64a3a31d843746321e0
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.911772 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ee643a2f08a8b054d1b17df6cc457adebdb9a45721a68aca46458555bb2a8531 WatchSource:0}: Error finding container ee643a2f08a8b054d1b17df6cc457adebdb9a45721a68aca46458555bb2a8531: Status 404 returned error can't find the container with id ee643a2f08a8b054d1b17df6cc457adebdb9a45721a68aca46458555bb2a8531
Oct 02 11:10:00 crc kubenswrapper[4929]: W1002 11:10:00.948489 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.948591 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.963867 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.965357 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.965386 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.965398 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:00 crc kubenswrapper[4929]: I1002 11:10:00.965429 4929 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 11:10:00 crc kubenswrapper[4929]: E1002 11:10:00.965812 4929 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc"
Oct 02 11:10:01 crc kubenswrapper[4929]: W1002 11:10:01.040634 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 02 11:10:01 crc kubenswrapper[4929]: E1002 11:10:01.040813 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.071205 4929 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.165496 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"063e6d2e14f1c49945cd977331097c4cfee6ff6181154a3704277e109492ea65"} Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.167249 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"57271b7f720bbb4b2fdf75c44245c15625834e875072727cd47fb951746579a5"} Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.168476 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ee643a2f08a8b054d1b17df6cc457adebdb9a45721a68aca46458555bb2a8531"} Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.169473 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e0b637a335c19657d4d46fd86d8f4b832a3f18bcfd9bb64a3a31d843746321e0"} Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.170382 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3ebd0f258ad9ebcfea45f0ddad3e5e7064c9d3ba60ba0e920d7cf23bf1bb1c81"} Oct 02 11:10:01 crc kubenswrapper[4929]: W1002 11:10:01.334515 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 02 11:10:01 crc kubenswrapper[4929]: E1002 11:10:01.334615 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:10:01 crc kubenswrapper[4929]: E1002 11:10:01.490348 4929 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="1.6s" Oct 02 11:10:01 crc kubenswrapper[4929]: W1002 11:10:01.522287 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 02 11:10:01 crc kubenswrapper[4929]: E1002 11:10:01.522382 4929 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.765938 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.767386 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.767435 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.767449 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:01 crc kubenswrapper[4929]: I1002 11:10:01.767475 4929 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:10:01 crc kubenswrapper[4929]: E1002 11:10:01.768171 4929 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.070699 4929 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.175587 4929 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e" exitCode=0 Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.175706 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.175705 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.177072 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.177105 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.177118 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.181657 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.181689 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.181698 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.181701 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.181772 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.182546 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.182576 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.182586 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.183326 4929 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4" exitCode=0 Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.183429 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.183728 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.184114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.184136 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.184144 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.185693 4929 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2dd45d8ab151d9808a6d669c117d3dfb9d6c15830cf8e8daaeafa607ffadbaa4" exitCode=0 Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.185744 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.185917 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2dd45d8ab151d9808a6d669c117d3dfb9d6c15830cf8e8daaeafa607ffadbaa4"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.186383 4929 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.186435 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.186457 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.188199 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.188904 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f" exitCode=0 Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.188935 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.188986 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f"} Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.189020 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.189038 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.189048 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.190084 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.190107 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:02 crc kubenswrapper[4929]: I1002 11:10:02.190118 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:02 crc kubenswrapper[4929]: W1002 11:10:02.678613 4929 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 02 11:10:02 crc kubenswrapper[4929]: E1002 11:10:02.679055 4929 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.070209 4929 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 02 11:10:03 crc kubenswrapper[4929]: E1002 11:10:03.091969 4929 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="3.2s" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.193739 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c8c14ade5b531de7d45ae71259cb0d04b23ad2785a37257732a83007134ac881"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.193809 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.195533 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.195570 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.195581 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.199224 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.199258 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.199285 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.199319 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.200427 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.200468 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.200482 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.202848 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.202884 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 
11:10:03.202899 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.202914 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.209158 4929 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="834256c75ef6a6515680b23b486275cf497473404bf0640441348f54e0ad6718" exitCode=0 Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.209263 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.209670 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.209685 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"834256c75ef6a6515680b23b486275cf497473404bf0640441348f54e0ad6718"} Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.211300 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.211331 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.211340 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.211949 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.211984 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.211992 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.368366 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.370712 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.370764 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.370775 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:03 crc kubenswrapper[4929]: I1002 11:10:03.370805 4929 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:10:03 crc kubenswrapper[4929]: E1002 11:10:03.371931 4929 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: 
connection refused" node="crc" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.214433 4929 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="de91f05ac52b7f32a9703d2eb911771deff9aa19e8d3f40d3e4879f4b288aca0" exitCode=0 Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.214528 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"de91f05ac52b7f32a9703d2eb911771deff9aa19e8d3f40d3e4879f4b288aca0"} Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.214607 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.217129 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.217164 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.217176 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.220519 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519"} Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.220663 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.220686 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.220718 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.220686 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.221995 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222025 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222038 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222104 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222145 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222194 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222821 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222850 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:04 crc kubenswrapper[4929]: I1002 11:10:04.222862 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.228571 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"71fb60d534194685485e82b93ce85a982783642ba9e1648cd5d977fa98070f8f"} Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.228621 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3b01a6a93f19c3eaeffc75bd3b98dd285a42ad5a97d150d7979c95ebba78afaa"} Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.228638 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30b7966d001502592833ef02ade1e97de0a1080bd224ea5e6d5473ae339a25a8"} Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.228642 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.228711 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.228651 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"59699b879f0c274217ae04156102152594e2f465dac2b1bf1d0e92bdb3d00ad8"} Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.229702 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.229770 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.229788 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.320466 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.320688 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.322472 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.322543 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.322557 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.581626 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.581838 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.583105 4929 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.583192 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.583216 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.813316 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.850121 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:05 crc kubenswrapper[4929]: I1002 11:10:05.857639 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.236483 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.237240 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.237240 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dbc8ab0a5c43e8cd381e1893a675f60f167c688df32270a0ef739e62b59f69f2"} Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.237906 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.237939 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.237951 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.238240 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.238273 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.238289 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.239863 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.240047 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.240461 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.241984 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.242253 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:06 
crc kubenswrapper[4929]: I1002 11:10:06.242280 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.307944 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.501539 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.573066 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.574594 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.574644 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.574662 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:06 crc kubenswrapper[4929]: I1002 11:10:06.574697 4929 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.239384 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.239419 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.239448 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.239475 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.239654 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.240685 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.240722 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.240738 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.240754 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.240779 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.240791 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.241297 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.241337 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:07 crc kubenswrapper[4929]: I1002 11:10:07.241354 4929 
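
"Attempting to register node" is, at bottom, a POST of a Node object to /api/v1/nodes, the same request that keeps failing above with connection refused (and, once the API server is up, can instead fail at admission, as happens further down). A rough client-go equivalent as a sketch only: the kubeconfig path is an assumption, and the kubelet populates far more of the Node object than this stub does.

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust for the environment at hand.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	node := &corev1.Node{ObjectMeta: metav1.ObjectMeta{Name: "crc"}}
	_, err = client.CoreV1().Nodes().Create(context.TODO(), node, metav1.CreateOptions{})
	if err != nil {
		// While the API server is down this is the "connection refused"
		// error from the log; later it can be an admission rejection instead.
		fmt.Println("register failed:", err)
		return
	}
	fmt.Println("node registered")
}
```
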
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:08 crc kubenswrapper[4929]: I1002 11:10:08.242339 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:08 crc kubenswrapper[4929]: I1002 11:10:08.243797 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:08 crc kubenswrapper[4929]: I1002 11:10:08.243881 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:08 crc kubenswrapper[4929]: I1002 11:10:08.243901 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:08 crc kubenswrapper[4929]: I1002 11:10:08.813364 4929 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 11:10:08 crc kubenswrapper[4929]: I1002 11:10:08.813652 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.627056 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.627262 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.628425 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.628460 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.628472 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.718247 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.718463 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.720171 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.720280 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:09 crc kubenswrapper[4929]: I1002 11:10:09.720359 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:10 crc kubenswrapper[4929]: E1002 11:10:10.270830 4929 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 11:10:11 crc 
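
The cluster-policy-controller startup probe above fails with "context deadline exceeded (Client.Timeout exceeded while awaiting headers)", which is precisely what Go's http.Client reports when no response headers arrive within its Timeout. A sketch of roughly what the kubelet's HTTPS probe does against the address in the log; the 1s timeout is illustrative, and skipping certificate verification mirrors how kubelet HTTPS probes behave.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		// On expiry this yields "context deadline exceeded (Client.Timeout
		// exceeded while awaiting headers)", the exact failure in the log.
		Timeout: 1 * time.Second,
		Transport: &http.Transport{
			// Kubelet HTTPS probes do not verify the serving certificate.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://192.168.126.11:10357/healthz")
	if err != nil {
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	// The kubelet treats any 2xx response as probe success.
	fmt.Println("probe status:", resp.StatusCode)
}
```
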
kubenswrapper[4929]: I1002 11:10:11.152567 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 02 11:10:11 crc kubenswrapper[4929]: I1002 11:10:11.152763 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:11 crc kubenswrapper[4929]: I1002 11:10:11.154168 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:11 crc kubenswrapper[4929]: I1002 11:10:11.154215 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:11 crc kubenswrapper[4929]: I1002 11:10:11.154226 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:13 crc kubenswrapper[4929]: I1002 11:10:13.942449 4929 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 11:10:13 crc kubenswrapper[4929]: I1002 11:10:13.942516 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 11:10:13 crc kubenswrapper[4929]: I1002 11:10:13.947674 4929 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 11:10:13 crc kubenswrapper[4929]: I1002 11:10:13.947718 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.248917 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.249265 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.251215 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.251288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.251312 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.256770 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.271948 4929 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.272999 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.273038 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.273050 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.361077 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.361410 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.363437 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.363579 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.363608 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:16 crc kubenswrapper[4929]: I1002 11:10:16.384343 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 02 11:10:17 crc kubenswrapper[4929]: I1002 11:10:17.274599 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:17 crc kubenswrapper[4929]: I1002 11:10:17.276043 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:17 crc kubenswrapper[4929]: I1002 11:10:17.276114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:17 crc kubenswrapper[4929]: I1002 11:10:17.276134 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.814717 4929 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.814854 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.941040 4929 trace.go:236] Trace[429514499]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:10:04.176) (total time: 14764ms): Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[429514499]: ---"Objects listed" error: 14764ms (11:10:18.940) Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[429514499]: [14.764179105s] [14.764179105s] END Oct 02 
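
The Reflector ListAndWatch traces here, and the "Caches populated" messages that follow, are client-go's shared informers finally completing their initial list after roughly 14 seconds of connection-refused retries. A minimal sketch of that machinery for one of the listed types, assuming it runs in-cluster; this is the generic informer pattern, not the kubelet's internal wiring.

```go
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumes this runs inside the cluster
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactory(client, 10*time.Minute)
	factory.Core().V1().Services().Informer() // one of the types listed in the log

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	// Blocks until the initial list succeeds, the moment the log records
	// as "Caches populated for *v1.Service".
	for typ, synced := range factory.WaitForCacheSync(stop) {
		fmt.Printf("cache synced for %v: %v\n", typ, synced)
	}
}
```
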
11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.941589 4929 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.941896 4929 trace.go:236] Trace[1145959458]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:10:04.481) (total time: 14460ms):
Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[1145959458]: ---"Objects listed" error: 14460ms (11:10:18.941)
Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[1145959458]: [14.460331872s] [14.460331872s] END
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.941950 4929 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.944158 4929 trace.go:236] Trace[2045993077]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:10:08.321) (total time: 10622ms):
Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[2045993077]: ---"Objects listed" error: 10622ms (11:10:18.944)
Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[2045993077]: [10.622591032s] [10.622591032s] END
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.944210 4929 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.944231 4929 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.944419 4929 trace.go:236] Trace[219036558]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 11:10:04.148) (total time: 14795ms):
Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[219036558]: ---"Objects listed" error: 14795ms (11:10:18.944)
Oct 02 11:10:18 crc kubenswrapper[4929]: Trace[219036558]: [14.795748329s] [14.795748329s] END
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.944447 4929 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 02 11:10:18 crc kubenswrapper[4929]: E1002 11:10:18.945585 4929 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 02 11:10:18 crc kubenswrapper[4929]: E1002 11:10:18.947199 4929 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.972622 4929 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48672->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.972665 4929 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48684->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.972736 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48684->192.168.126.11:17697: read: connection reset by peer"
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.972677 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48672->192.168.126.11:17697: read: connection reset by peer"
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.973149 4929 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 02 11:10:18 crc kubenswrapper[4929]: I1002 11:10:18.973175 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.063190 4929 apiserver.go:52] "Watching apiserver"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.067229 4929 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.067540 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.067886 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.068197 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.068305 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.068417 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.068427 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.068441 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.068832 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.068460 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.068902 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.070507 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.071116 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.071260 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.071524 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.071706 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.071594 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.072180 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.072351 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.073312 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.085973 4929 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.097564 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.110128 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.124262 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.133743 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.145298 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.145730 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.145668 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.145824 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.145848 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.146301 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.146889 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.147324 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.146927 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.146224 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.147746 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.147924 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.146494 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.146686 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.146794 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.147987 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148024 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.147251 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.147686 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148162 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148353 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148059 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148400 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148418 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148419 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148436 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148483 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148507 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148527 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148570 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148586 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148603 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148620 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148636 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148652 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148668 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148684 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148699 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148747 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148764 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148781 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148781 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148799 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148819 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148887 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148909 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148924 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148939 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148976 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.148999 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149037 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149071 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149086 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149103 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149118 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149134 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149170 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149180 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149188 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149229 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149254 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149288 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149351 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149390 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149431 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149467 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149504 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149505 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149569 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149595 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149615 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149634 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149648 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149656 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149658 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149865 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149892 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149934 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149986 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150016 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150046 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150070 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150096 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150121 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150146 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150172 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150199 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150222 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150248 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150274 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150386 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150437 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150471 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150505 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150535 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150578 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150611 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150644 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150675 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150718 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150782 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150809 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150836 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150864 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150890 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150917 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150945 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150997 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151024 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151048 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151072 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151098 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151121 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151147 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151172 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151203 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151231 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151260 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151289 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151325 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151360 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151393 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151428 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151458 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151489 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151525 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151621 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151661 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151695 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151728 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151760 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151807 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151852 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151887 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151925 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID:
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151982 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152011 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152040 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152072 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152099 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152128 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152218 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152258 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152292 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152327 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152369 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152403 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152452 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152490 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152522 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152547 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152573 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152601 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152627 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152654 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152683 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152714 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152743 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152771 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.149879 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150892 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150924 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.150944 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151220 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152909 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151403 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151502 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151592 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.151746 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152012 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.153106 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152036 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152138 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152207 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152349 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152716 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152772 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.153227 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.153332 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.153568 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.153449 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.153802 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.154065 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.154484 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.154501 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.154537 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.154575 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.154626 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.154814 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:10:19.654783215 +0000 UTC m=+20.205149789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.154983 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.155037 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.155060 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.155304 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.155661 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.155385 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.155770 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.155814 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.156046 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.156183 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.156219 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.156264 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.156382 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.156577 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.156596 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.157048 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.157040 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.157068 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.157589 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.157729 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.158586 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.158851 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.158869 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.158901 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.159043 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.159571 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.159585 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.159718 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.159751 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.159888 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.159916 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.160073 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.160281 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.160624 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.160647 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.160695 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.160813 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161092 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161154 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161272 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161328 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161492 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161692 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161712 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161840 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.161897 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162170 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162191 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162417 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.152801 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162583 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162612 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162635 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162653 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162673 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " 
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162692 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162700 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162711 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162732 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162776 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162801 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162819 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162836 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162853 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162870 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162886 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162903 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162921 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162939 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162970 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.162990 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163010 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163030 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163051 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163073 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163097 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163117 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163135 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163157 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163174 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163195 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163213 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163233 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163251 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163271 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163290 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163308 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163327 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163344 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163363 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163381 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163400 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163420 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163440 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163459 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163476 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163493 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163511 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163529 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163546 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163564 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163581 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163598 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163614 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163696 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163721 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163742 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163068 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163158 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163226 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163259 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163386 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163428 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163624 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.163966 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164042 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164297 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164505 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164617 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164659 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164682 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164704 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164727 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164748 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164771 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164794 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164812 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164833 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164855 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164940 4929 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164953 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164988 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164999 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165009 4929 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165019 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165028 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165039 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165049 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165058 4929 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165067 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165077 4929 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165086 4929 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165095 4929 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165105 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165114 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165123 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165133 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165143 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165152 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165161 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165171 4929 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165181 4929 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165192 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165204 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165213 4929 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165222 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165231 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165241 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165251 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165259 4929 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165268 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165277 4929 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165286 4929 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165295 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165305 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165314 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165323 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165332 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165341 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165350 4929 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165359 4929 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165369 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165378 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165387 4929 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165397 4929 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165405 4929 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165415 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165424 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165434 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165442 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165451 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165461 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165470 4929 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.167930 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165488 4929 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168239 4929 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168257 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168275 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168292 4929 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168306 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168317 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168336 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168347 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168359 4929 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
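The only failure in this window is the status patch for network-check-source-55646444c4-trplf: before accepting the patch, the API server must call the pod.network-node-identity.openshift.io webhook, and its backend on 127.0.0.1:9743 refuses connections because network-node-identity-vrzqb, the pod that serves it, is itself still mounting its volumes (webhook-cert, env-overrides, and ovnkube-identity-cm above). The status manager retries, so this should clear on its own once that pod is running. A loopback probe run on the node, with the address taken verbatim from the error text, confirms when the window closes; this is an illustrative sketch, not part of any OpenShift tooling:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Address taken from the kubelet error above; the webhook is served
        // over the loopback interface, so run this on the node itself.
        conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
        if err != nil {
            fmt.Println("webhook still down:", err) // "connection refused" while the pod restarts
            return
        }
        conn.Close()
        fmt.Println("webhook is listening again; the status patch should now go through")
    }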
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168370 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168381 4929 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168392 4929 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168402 4929 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168412 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168422 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168432 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168442 4929 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168458 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168467 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168477 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168487 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168497 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 
11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168506 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168516 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168526 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168539 4929 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168553 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168566 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168579 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168592 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168605 4929 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168618 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168631 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168651 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168662 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168674 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168683 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168695 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168708 4929 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168721 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168733 4929 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168745 4929 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168759 4929 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168771 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168812 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168824 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168838 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168851 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168865 4929 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168877 4929 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168907 4929 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.168917 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.164628 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165060 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165102 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165476 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165547 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165552 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165679 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165706 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165703 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.165908 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166059 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166150 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166277 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166300 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166332 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166617 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166636 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.166977 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.167312 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.170781 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.171318 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:19.671289623 +0000 UTC m=+20.221655987 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.171373 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.171406 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:19.671399446 +0000 UTC m=+20.221765810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.171602 4929 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.172319 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.173083 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.173078 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.173192 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.173207 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.173538 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.174298 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.179090 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.183973 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.184171 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.187035 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.187223 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.187345 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.189998 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.191340 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.191367 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.191385 4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.191453 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:19.69143124 +0000 UTC m=+20.241797824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.191510 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.191942 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.192385 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.192461 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.192708 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.192738 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.192753 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.192817 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:19.692801227 +0000 UTC m=+20.243167811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.193648 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.193703 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.193734 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.193862 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.194592 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.194606 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.194804 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.195546 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.201350 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.201544 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.202287 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.203014 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.203194 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.203251 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.203908 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.203928 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.204039 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.204468 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.204601 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.204915 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.205053 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.205065 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.205097 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.205464 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.205717 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.205993 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.206049 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.206063 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.206243 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.206436 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.206853 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.206853 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.207410 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.207479 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.207570 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209110 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209204 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209340 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209378 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209456 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209888 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209475 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209526 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.209544 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.210253 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.210330 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.210494 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.210646 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.210659 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.210672 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.213108 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.213907 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.213948 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.214091 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.214158 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.214524 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.216413 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.227668 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.233939 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.235660 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.269955 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270059 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270134 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270151 4929 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270166 4929 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270179 4929 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270191 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270204 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270218 4929 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270233 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 
11:10:19.270246 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270260 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270273 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270286 4929 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270299 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270313 4929 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270325 4929 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270337 4929 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270349 4929 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270362 4929 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270374 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270386 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270398 4929 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270411 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270426 4929 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270440 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270454 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270467 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270480 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270493 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270506 4929 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270521 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270535 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270548 4929 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270560 4929 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270573 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270586 4929 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270600 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270612 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270624 4929 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270638 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270650 4929 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270663 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270676 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270688 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270703 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270716 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270730 4929 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270742 4929 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270755 4929 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270770 4929 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270783 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270823 4929 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270838 4929 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270851 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270866 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270878 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270891 4929 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270906 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270919 4929 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270931 4929 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270944 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270957 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.270989 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271005 4929 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271018 4929 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271031 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271044 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271058 4929 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271071 4929 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271083 4929 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271096 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271109 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271123 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271136 4929 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271148 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271161 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271175 4929 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271188 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271202 4929 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271215 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271230 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271243 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271256 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271268 4929 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271282 4929 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271303 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271316 4929 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.271329 4929 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.272168 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.273195 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.284897 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.287893 4929 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519" exitCode=255 Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.287942 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519"} Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.301731 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.302262 4929 scope.go:117] "RemoveContainer" containerID="c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.302565 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.312458 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.327240 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.336132 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.347047 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.357391 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.385585 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.393681 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.398473 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 11:10:19 crc kubenswrapper[4929]: W1002 11:10:19.398615 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d4bbb21539ea13af585ae1092dbe9b75596e12d52562c24c1afea0bfb9b8fc5b WatchSource:0}: Error finding container d4bbb21539ea13af585ae1092dbe9b75596e12d52562c24c1afea0bfb9b8fc5b: Status 404 returned error can't find the container with id d4bbb21539ea13af585ae1092dbe9b75596e12d52562c24c1afea0bfb9b8fc5b
Oct 02 11:10:19 crc kubenswrapper[4929]: W1002 11:10:19.413198 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-0148f40a2ab33249e824645d606d327d0517ac0b6dce53b4cc023363893a3029 WatchSource:0}: Error finding container 0148f40a2ab33249e824645d606d327d0517ac0b6dce53b4cc023363893a3029: Status 404 returned error can't find the container with id 0148f40a2ab33249e824645d606d327d0517ac0b6dce53b4cc023363893a3029
Oct 02 11:10:19 crc kubenswrapper[4929]: W1002 11:10:19.414601 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-9046dd928ec4a9da88a096d6bc03e1f7bf4d7e9900970d0c75ec3f72658497b5 WatchSource:0}: Error finding container 9046dd928ec4a9da88a096d6bc03e1f7bf4d7e9900970d0c75ec3f72658497b5: Status 404 returned error can't find the container with id 9046dd928ec4a9da88a096d6bc03e1f7bf4d7e9900970d0c75ec3f72658497b5
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.674370 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.674437 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.674470 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.674548 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.674588 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:10:20.674542272 +0000 UTC m=+21.224908646 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.674646 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:20.674630284 +0000 UTC m=+21.224996658 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.674702 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.674853 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:20.674798599 +0000 UTC m=+21.225164963 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.723751 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.734231 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.734581 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.744225 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.753319 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.772025 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.775024 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.775076 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.775263 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.775287 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.775307 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.775377 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:20.775349918 +0000 UTC m=+21.325716292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.776267 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.776330 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.776351 4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:10:19 crc kubenswrapper[4929]: E1002 11:10:19.776451 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:20.776413817 +0000 UTC m=+21.326780201 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.790508 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.805087 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:10:19 crc kubenswrapper[4929]: I1002 11:10:19.818220 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.160916 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.161863 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.163483 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.164429 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.165928 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.166815 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.167435 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.168017 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.168607 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.169127 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.169623 4929 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.170288 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.170800 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.172310 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.172894 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.173532 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.174392 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.176667 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.177484 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.178843 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.179573 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.180524 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.181634 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.182441 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.183524 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.184162 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.185185 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.185651 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.187129 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02
T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.200119 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.202699 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.205015 4929 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.206158 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 
02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.209760 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.211077 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-k
ube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.211752 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.212726 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.216491 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.218816 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.226331 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.227069 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.227459 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.228261 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.228779 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.229411 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.230191 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.230835 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.231318 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.231861 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.232372 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.233221 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.233726 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.235017 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.235542 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.236211 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.236812 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.237338 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.244893 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.258206 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.285621 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.292801 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a"} Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.292858 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9046dd928ec4a9da88a096d6bc03e1f7bf4d7e9900970d0c75ec3f72658497b5"} Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.294843 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012"} Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.294893 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d"} Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.294905 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0148f40a2ab33249e824645d606d327d0517ac0b6dce53b4cc023363893a3029"} Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.299381 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d4bbb21539ea13af585ae1092dbe9b75596e12d52562c24c1afea0bfb9b8fc5b"} Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.301427 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.306394 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff"} Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.307637 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.328069 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.345524 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.365596 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.385747 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.399583 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.413443 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.426395 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.441939 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.462198 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.684249 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.684325 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.684355 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.684425 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.684476 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:22.684462461 +0000 UTC m=+23.234828825 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.684521 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.684606 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:22.684586775 +0000 UTC m=+23.234953139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.684686 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:10:22.684678957 +0000 UTC m=+23.235045321 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.785126 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:20 crc kubenswrapper[4929]: I1002 11:10:20.785196 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.785382 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.785426 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.785440 
4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.785507 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:22.785488773 +0000 UTC m=+23.335855127 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.785935 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.785990 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.786003 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:20 crc kubenswrapper[4929]: E1002 11:10:20.786063 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:22.786045638 +0000 UTC m=+23.336412002 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:21 crc kubenswrapper[4929]: I1002 11:10:21.156092 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:21 crc kubenswrapper[4929]: E1002 11:10:21.156297 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:21 crc kubenswrapper[4929]: I1002 11:10:21.156370 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:21 crc kubenswrapper[4929]: I1002 11:10:21.156382 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:21 crc kubenswrapper[4929]: E1002 11:10:21.157018 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:21 crc kubenswrapper[4929]: E1002 11:10:21.157183 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:21 crc kubenswrapper[4929]: I1002 11:10:21.308928 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.315992 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11"} Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.337686 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.360319 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.380777 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.398594 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.418698 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.438289 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.457756 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.475608 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:22Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.707219 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.707300 4929 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.707336 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.707425 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.707491 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:26.707472948 +0000 UTC m=+27.257839312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.707556 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.707606 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:10:26.70754889 +0000 UTC m=+27.257915294 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.707677 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:26.707662323 +0000 UTC m=+27.258028727 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.808666 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:22 crc kubenswrapper[4929]: I1002 11:10:22.808728 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.808875 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.808893 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.808910 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.808991 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.809045 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.809063 4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.809001 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:26.808982553 +0000 UTC m=+27.359348927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:22 crc kubenswrapper[4929]: E1002 11:10:22.809173 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:26.809143417 +0000 UTC m=+27.359509801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:23 crc kubenswrapper[4929]: I1002 11:10:23.156084 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:23 crc kubenswrapper[4929]: I1002 11:10:23.156144 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:23 crc kubenswrapper[4929]: I1002 11:10:23.156206 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:23 crc kubenswrapper[4929]: E1002 11:10:23.156357 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:23 crc kubenswrapper[4929]: E1002 11:10:23.156505 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:23 crc kubenswrapper[4929]: E1002 11:10:23.156631 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.156149 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.156157 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.156173 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.156300 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.156552 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.156617 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.255107 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-q4fb6"] Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.255493 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.258057 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.258715 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.258780 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.275736 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.307566 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.345996 4929 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.351035 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.351078 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.351096 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.351161 4929 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.372458 4929 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.372795 4929 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.373809 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.373841 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.373854 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.373870 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.373882 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.402719 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.428492 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/ce61e3b0-e445-41c1-be86-ac3e51cffbe1-hosts-file\") pod \"node-resolver-q4fb6\" (UID: \"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\") " pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.428543 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkb7\" (UniqueName: \"kubernetes.io/projected/ce61e3b0-e445-41c1-be86-ac3e51cffbe1-kube-api-access-wwkb7\") pod \"node-resolver-q4fb6\" (UID: \"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\") " pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.428968 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b17de2ab8b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z"
Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.433350 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.433383 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.433394 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.433415 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.433452 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.438152 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.454807 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.462010 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.466027 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.466065 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.466075 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.466097 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.466108 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.471308 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.480417 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.484088 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.484122 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.484133 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.484148 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.484159 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.485179 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.496438 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.497194 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.500745 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.500771 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.500783 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.500803 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.500813 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.510215 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.512317 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: E1002 11:10:25.512430 4929 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.514658 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.514695 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.514705 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.514722 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.514733 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.529162 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ce61e3b0-e445-41c1-be86-ac3e51cffbe1-hosts-file\") pod \"node-resolver-q4fb6\" (UID: \"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\") " pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.529208 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkb7\" (UniqueName: \"kubernetes.io/projected/ce61e3b0-e445-41c1-be86-ac3e51cffbe1-kube-api-access-wwkb7\") pod \"node-resolver-q4fb6\" (UID: \"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\") " pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.529323 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ce61e3b0-e445-41c1-be86-ac3e51cffbe1-hosts-file\") pod \"node-resolver-q4fb6\" (UID: \"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\") " pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.548542 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkb7\" (UniqueName: \"kubernetes.io/projected/ce61e3b0-e445-41c1-be86-ac3e51cffbe1-kube-api-access-wwkb7\") pod \"node-resolver-q4fb6\" (UID: \"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\") " pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.570277 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q4fb6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.619188 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.619270 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.619298 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.619331 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.619354 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.650878 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kxz86"] Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.651730 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5fzl7"] Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.652477 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.654418 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gbz4b"] Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.654619 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.654732 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8j488"] Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.656238 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.657250 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.658229 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.658558 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.658797 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.659625 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.662328 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.662660 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.662730 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.662985 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.663012 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.663132 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.663709 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.668191 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.668268 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.668269 4929 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.668509 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.668661 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.668918 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.672219 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.672584 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.689262 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.701652 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.714359 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.722515 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.722556 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.722565 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.722582 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.722592 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.733530 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.746947 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.762903 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.774872 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.786170 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.796822 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.807252 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.817846 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.821017 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.821289 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.824757 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.824830 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.824841 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.824861 4929 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.824875 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831486 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4599e863-12c0-4c39-a873-a46012459555-cni-binary-copy\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831516 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831539 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-systemd\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831597 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-var-lib-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831651 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-bin\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831691 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-config\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831720 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-cnibin\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831742 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4599e863-12c0-4c39-a873-a46012459555-multus-daemon-config\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831764 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-os-release\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831784 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-log-socket\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831836 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b4b5329-0385-4f39-9d63-70284421e448-mcd-auth-proxy-config\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831858 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovn-node-metrics-cert\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831877 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkltr\" (UniqueName: \"kubernetes.io/projected/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-kube-api-access-pkltr\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831899 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-system-cni-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831919 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/840bd011-2ac2-422e-adc5-5de6c717fd54-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831934 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-node-log\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 
11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-ovn\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.831996 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-systemd-units\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832019 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-os-release\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832039 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-conf-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832061 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-etc-kubernetes\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832110 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hpmm\" (UniqueName: \"kubernetes.io/projected/1b4b5329-0385-4f39-9d63-70284421e448-kube-api-access-8hpmm\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832143 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832186 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-socket-dir-parent\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832210 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1b4b5329-0385-4f39-9d63-70284421e448-rootfs\") pod \"machine-config-daemon-8j488\" (UID: 
\"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832244 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/840bd011-2ac2-422e-adc5-5de6c717fd54-cni-binary-copy\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832283 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrfbp\" (UniqueName: \"kubernetes.io/projected/840bd011-2ac2-422e-adc5-5de6c717fd54-kube-api-access-xrfbp\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832307 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832346 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.832375 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-cni-multus\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833011 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-kubelet\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833102 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-system-cni-dir\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833137 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-hostroot\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833188 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-slash\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833219 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-etc-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833297 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4b5329-0385-4f39-9d63-70284421e448-proxy-tls\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833366 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-cnibin\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833428 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-netns\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833479 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-netd\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833507 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-script-lib\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833560 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pxn\" (UniqueName: \"kubernetes.io/projected/4599e863-12c0-4c39-a873-a46012459555-kube-api-access-d6pxn\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833650 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-multus-certs\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 
11:10:25.833741 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-cni-bin\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833811 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-k8s-cni-cncf-io\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833859 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-netns\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833906 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-kubelet\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833932 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-env-overrides\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.833972 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-cni-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.834024 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.845697 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.857718 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.874216 
4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.886413 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.897378 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.908870 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.920370 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.927556 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.927588 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.927597 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.927611 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.927620 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:25Z","lastTransitionTime":"2025-10-02T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.931155 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-hostroot\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935289 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-slash\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935306 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-etc-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935325 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4b5329-0385-4f39-9d63-70284421e448-proxy-tls\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935339 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-cnibin\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935362 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-netns\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935377 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-netd\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935400 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-hostroot\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935410 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-script-lib\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935481 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6pxn\" (UniqueName: \"kubernetes.io/projected/4599e863-12c0-4c39-a873-a46012459555-kube-api-access-d6pxn\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935536 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-multus-certs\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935560 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-cni-bin\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935588 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-k8s-cni-cncf-io\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935660 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-netns\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935716 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-kubelet\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935740 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-env-overrides\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935767 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-cni-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935791 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4599e863-12c0-4c39-a873-a46012459555-cni-binary-copy\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935819 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935843 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-systemd\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935867 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-var-lib-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: 
\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935893 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-bin\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935918 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-config\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935944 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-cnibin\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.935987 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4599e863-12c0-4c39-a873-a46012459555-multus-daemon-config\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936012 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-os-release\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936021 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-script-lib\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936038 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-log-socket\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936077 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-slash\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936077 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b4b5329-0385-4f39-9d63-70284421e448-mcd-auth-proxy-config\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 
crc kubenswrapper[4929]: I1002 11:10:25.936110 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovn-node-metrics-cert\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936125 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkltr\" (UniqueName: \"kubernetes.io/projected/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-kube-api-access-pkltr\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936145 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-system-cni-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936161 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/840bd011-2ac2-422e-adc5-5de6c717fd54-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936176 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-node-log\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936192 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-ovn\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936219 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-systemd-units\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936234 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-os-release\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936248 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-conf-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936263 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-etc-kubernetes\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936281 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hpmm\" (UniqueName: \"kubernetes.io/projected/1b4b5329-0385-4f39-9d63-70284421e448-kube-api-access-8hpmm\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936296 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936321 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-socket-dir-parent\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936339 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1b4b5329-0385-4f39-9d63-70284421e448-rootfs\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936356 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/840bd011-2ac2-422e-adc5-5de6c717fd54-cni-binary-copy\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936372 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrfbp\" (UniqueName: \"kubernetes.io/projected/840bd011-2ac2-422e-adc5-5de6c717fd54-kube-api-access-xrfbp\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936388 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936411 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936425 4929 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-cni-multus\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936439 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-kubelet\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936453 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-system-cni-dir\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936498 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-system-cni-dir\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936523 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-etc-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936837 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b4b5329-0385-4f39-9d63-70284421e448-mcd-auth-proxy-config\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.936979 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-cnibin\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937006 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-conf-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937060 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-etc-kubernetes\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937067 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-netns\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937098 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-netd\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937155 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-multus-certs\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937191 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-cni-bin\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937223 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-k8s-cni-cncf-io\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937260 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-run-netns\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937293 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-kubelet\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937383 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937432 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-socket-dir-parent\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937458 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1b4b5329-0385-4f39-9d63-70284421e448-rootfs\") pod \"machine-config-daemon-8j488\" (UID: 
\"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937769 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-node-log\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937951 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-cni-multus\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938007 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-systemd-units\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938069 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-host-var-lib-kubelet\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938121 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-ovn\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.937983 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-system-cni-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938175 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-os-release\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938223 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-os-release\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938252 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/840bd011-2ac2-422e-adc5-5de6c717fd54-cni-binary-copy\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938282 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-cnibin\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938310 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938319 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-log-socket\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938510 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4599e863-12c0-4c39-a873-a46012459555-multus-cni-dir\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938553 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-systemd\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938650 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/840bd011-2ac2-422e-adc5-5de6c717fd54-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938713 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-var-lib-openvswitch\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938803 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-bin\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938821 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4599e863-12c0-4c39-a873-a46012459555-multus-daemon-config\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.938862 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-env-overrides\") pod \"ovnkube-node-5fzl7\" (UID: 
\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.940508 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4599e863-12c0-4c39-a873-a46012459555-cni-binary-copy\") pod \"multus-gbz4b\" (UID: \"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.942166 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-config\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.942256 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/840bd011-2ac2-422e-adc5-5de6c717fd54-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.942321 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.944850 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b4b5329-0385-4f39-9d63-70284421e448-proxy-tls\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.944954 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovn-node-metrics-cert\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.955594 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.980339 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkltr\" (UniqueName: \"kubernetes.io/projected/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-kube-api-access-pkltr\") pod \"ovnkube-node-5fzl7\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.982642 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hpmm\" (UniqueName: \"kubernetes.io/projected/1b4b5329-0385-4f39-9d63-70284421e448-kube-api-access-8hpmm\") pod \"machine-config-daemon-8j488\" (UID: \"1b4b5329-0385-4f39-9d63-70284421e448\") " pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.989159 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6pxn\" (UniqueName: \"kubernetes.io/projected/4599e863-12c0-4c39-a873-a46012459555-kube-api-access-d6pxn\") pod \"multus-gbz4b\" (UID: 
\"4599e863-12c0-4c39-a873-a46012459555\") " pod="openshift-multus/multus-gbz4b" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.989368 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.992093 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrfbp\" (UniqueName: \"kubernetes.io/projected/840bd011-2ac2-422e-adc5-5de6c717fd54-kube-api-access-xrfbp\") pod \"multus-additional-cni-plugins-kxz86\" (UID: \"840bd011-2ac2-422e-adc5-5de6c717fd54\") " pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:25 crc kubenswrapper[4929]: I1002 11:10:25.994756 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.001221 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:25Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: W1002 11:10:26.009427 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5862ad0e_b703_4706_a7b4_25e4fdf5f78e.slice/crio-4cd6b6c8789c6571117bbaf188272d65f466116dd57d1acea20cfebca7c30f33 WatchSource:0}: Error finding container 4cd6b6c8789c6571117bbaf188272d65f466116dd57d1acea20cfebca7c30f33: Status 404 returned error can't find the container with id 4cd6b6c8789c6571117bbaf188272d65f466116dd57d1acea20cfebca7c30f33 Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.014228 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.026426 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.030387 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.030543 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.030601 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.030660 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.030787 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.039142 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.052466 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.062682 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.077704 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.090129 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.102934 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.114200 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.127342 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.133715 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.133754 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.133767 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.133784 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.133797 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.138616 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.151802 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.172065 4929 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773
257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.182239 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.236305 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.236356 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.236368 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.236384 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.236394 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.276382 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kxz86" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.282550 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gbz4b" Oct 02 11:10:26 crc kubenswrapper[4929]: W1002 11:10:26.286399 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod840bd011_2ac2_422e_adc5_5de6c717fd54.slice/crio-2a5a389a8a1fbd09c313783b5c6c9b0ebe2f25761411cbd672545792b6ca82a0 WatchSource:0}: Error finding container 2a5a389a8a1fbd09c313783b5c6c9b0ebe2f25761411cbd672545792b6ca82a0: Status 404 returned error can't find the container with id 2a5a389a8a1fbd09c313783b5c6c9b0ebe2f25761411cbd672545792b6ca82a0 Oct 02 11:10:26 crc kubenswrapper[4929]: W1002 11:10:26.297607 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4599e863_12c0_4c39_a873_a46012459555.slice/crio-f8a3ce1b052848789ad17b743d4d9645fe6f6b432b28d731705461561acddb45 WatchSource:0}: Error finding container f8a3ce1b052848789ad17b743d4d9645fe6f6b432b28d731705461561acddb45: Status 404 returned error can't find the container with id f8a3ce1b052848789ad17b743d4d9645fe6f6b432b28d731705461561acddb45 Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.335788 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerStarted","Data":"2a5a389a8a1fbd09c313783b5c6c9b0ebe2f25761411cbd672545792b6ca82a0"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.338862 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.338908 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.338918 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.338935 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.338945 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.339490 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054" exitCode=0 Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.339534 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.339565 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"4cd6b6c8789c6571117bbaf188272d65f466116dd57d1acea20cfebca7c30f33"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.340473 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerStarted","Data":"f8a3ce1b052848789ad17b743d4d9645fe6f6b432b28d731705461561acddb45"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.342826 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.342898 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.342911 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"49d406419af3103f5e09f12278d6cbceb57ed06596de5c278fbfe4e220de500c"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.345420 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q4fb6" event={"ID":"ce61e3b0-e445-41c1-be86-ac3e51cffbe1","Type":"ContainerStarted","Data":"36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.345535 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q4fb6" event={"ID":"ce61e3b0-e445-41c1-be86-ac3e51cffbe1","Type":"ContainerStarted","Data":"9cd571a805158c012c8edde60670c8ddae99e2b0fc72ca6ca908463b4a256008"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.361910 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.374772 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.390143 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.403691 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.424075 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.439032 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.446309 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.446359 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.446376 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.446402 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.446420 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.450706 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.462233 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.474708 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.487694 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.501447 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.516497 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.528503 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 
2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.542821 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.551174 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.551214 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.551224 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.551243 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.551253 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.560164 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.578706 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.592592 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
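The patch bodies in these err strings are hard to read because the JSON has passed through nested Go %q quoting (the patch appears to be quoted inside the error, and the whole error is quoted again in the structured log line), producing the runs of \\\" seen throughout. Each level can be peeled off with strconv.Unquote; here is a small sketch using a trimmed, hypothetical fragment carrying one remaining level of quoting (the uid matches the iptables-alerter pod in the entry that follows).

// unquote.go - sketch: recover a readable patch body from an escaped
// err string of the kind logged above.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	// Hypothetical trimmed sample in the same quoted form as the log.
	escaped := `"{\"metadata\":{\"uid\":\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\"}}"`
	// strconv.Unquote removes one level of Go string quoting.
	patch, err := strconv.Unquote(escaped)
	if err != nil {
		log.Fatal(err)
	}
	var v map[string]any
	if err := json.Unmarshal([]byte(patch), &v); err != nil {
		log.Fatal(err)
	}
	pretty, _ := json.MarshalIndent(v, "", "  ")
	fmt.Println(string(pretty))
}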
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.605110 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.617335 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.627624 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.642355 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.653579 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.653611 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.653620 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.653634 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.653643 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.656003 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.670286 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.684798 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.697092 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.706209 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:26Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.743909 4929 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.744017 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.744043 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.744129 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.744181 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:34.744166946 +0000 UTC m=+35.294533310 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.744506 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:10:34.744496775 +0000 UTC m=+35.294863139 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.744571 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.744593 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-02 11:10:34.744587598 +0000 UTC m=+35.294953962 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.756179 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.756225 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.756237 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.756256 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.756267 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.845495 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.845851 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.845896 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.845910 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.845989 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:34.845952389 +0000 UTC m=+35.396318843 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.845990 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.845862 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.846016 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.846092 4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:26 crc kubenswrapper[4929]: E1002 11:10:26.846123 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:34.846114963 +0000 UTC m=+35.396481327 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.859823 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.859864 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.859873 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.859888 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.859899 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.966408 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.966450 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.966461 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.966483 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:26 crc kubenswrapper[4929]: I1002 11:10:26.966495 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:26Z","lastTransitionTime":"2025-10-02T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.069840 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.069884 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.069894 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.069913 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.069925 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.157071 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:27 crc kubenswrapper[4929]: E1002 11:10:27.157219 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.157526 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:27 crc kubenswrapper[4929]: E1002 11:10:27.157600 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.157639 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:27 crc kubenswrapper[4929]: E1002 11:10:27.157680 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.172312 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.172337 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.172347 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.172360 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.172371 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.274930 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.274978 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.274987 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.275002 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.275011 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.349916 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerStarted","Data":"91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.351024 4929 generic.go:334] "Generic (PLEG): container finished" podID="840bd011-2ac2-422e-adc5-5de6c717fd54" containerID="98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821" exitCode=0 Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.351102 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerDied","Data":"98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.355503 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.355546 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.355562 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.355575 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.355589 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.355601 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.367931 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.376689 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.377114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.377128 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.377148 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.377161 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.388270 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.397094 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.412111 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.435090 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.448789 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.461184 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.474173 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.480335 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.480408 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.480421 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.480437 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.480447 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.485225 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
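
The "Node became not ready" records above all trace back to the same readiness gate: the kubelet reports NetworkReady=false until a CNI network config can be loaded from /etc/kubernetes/cni/net.d/. A minimal sketch of that check in Go, using only the directory name taken from the log message (the real kubelet goes through libcni; the accepted extensions below are libcni's defaults, and this is an illustration, not kubelet source):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log message
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	var configs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni accepts
			configs = append(configs, e.Name())
		}
	}
	if len(configs) == 0 {
		// Matches the condition reported above: NetworkPluginNotReady
		fmt.Println("no CNI configuration file in", confDir, "- network plugin not ready")
		return
	}
	fmt.Println("CNI configs found:", configs)
}
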
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.496386 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.507184 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.523829 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
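
Every patch failure in this stream carries the same root-cause string from Go's crypto/x509 verifier: the webhook serving certificate expired on 2025-08-24T17:21:41Z, well before the current time 2025-10-02T11:10:27Z. A minimal sketch of the validity-window comparison that produces this wording, assuming the certificate is readable at /etc/webhook-cert/tls.crt (the /etc/webhook-cert/ mount path appears in the network-node-identity pod spec above; the tls.crt file name is an assumption):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Hypothetical path: the webhook-cert volume mount shown in the pod spec.
	pemBytes, err := os.ReadFile("/etc/webhook-cert/tls.crt")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		// Mirrors the logged error: "certificate has expired ... current time X is after Y"
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate is not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
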
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.542909 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.556636 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.571503 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.583001 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.583032 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.583041 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.583056 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.583067 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.584993 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.595745 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.613686 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.628693 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.641466 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.654649 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.667657 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.677565 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.685624 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.685683 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.685701 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.685728 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.685746 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.696510 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.713249 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.725352 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.788248 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.788295 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.788308 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.788327 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.788340 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.863906 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7hr2m"] Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.864461 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.866606 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.866693 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.867363 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.867464 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.881456 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.892622 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.892675 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.892687 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.892717 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.892733 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.894290 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.929623 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.956909 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr75p\" (UniqueName: \"kubernetes.io/projected/6013d401-6138-4c35-9a72-00a269b5c765-kube-api-access-jr75p\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.957125 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6013d401-6138-4c35-9a72-00a269b5c765-host\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.957256 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6013d401-6138-4c35-9a72-00a269b5c765-serviceca\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.970871 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:27Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.996206 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.996724 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.996812 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.996901 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:27 crc kubenswrapper[4929]: I1002 11:10:27.997010 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:27Z","lastTransitionTime":"2025-10-02T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.004547 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.044020 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.057947 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr75p\" (UniqueName: \"kubernetes.io/projected/6013d401-6138-4c35-9a72-00a269b5c765-kube-api-access-jr75p\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.058004 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/6013d401-6138-4c35-9a72-00a269b5c765-host\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.058034 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6013d401-6138-4c35-9a72-00a269b5c765-serviceca\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.058208 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6013d401-6138-4c35-9a72-00a269b5c765-host\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.058929 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6013d401-6138-4c35-9a72-00a269b5c765-serviceca\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.094827 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr75p\" (UniqueName: \"kubernetes.io/projected/6013d401-6138-4c35-9a72-00a269b5c765-kube-api-access-jr75p\") pod \"node-ca-7hr2m\" (UID: \"6013d401-6138-4c35-9a72-00a269b5c765\") " pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.099716 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.099882 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.099945 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.100029 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.100088 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.106823 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.145564 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.176507 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7hr2m" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.189585 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.202228 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.202267 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.202277 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.202293 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.202304 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.226068 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.266689 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc 
kubenswrapper[4929]: I1002 11:10:28.304524 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.308445 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.308486 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.308499 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.308515 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.308524 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.345446 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.359069 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7hr2m" event={"ID":"6013d401-6138-4c35-9a72-00a269b5c765","Type":"ContainerStarted","Data":"283d05df8b3b1b85bdd58ca12abf1a870598b3490cfd64acd6cb21f0d1fc79f1"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.360713 4929 generic.go:334] "Generic (PLEG): container finished" podID="840bd011-2ac2-422e-adc5-5de6c717fd54" containerID="59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b" exitCode=0 Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.361572 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerDied","Data":"59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.387363 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.412201 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.412244 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.412259 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.412277 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.412287 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.426819 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.469596 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.506674 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.515592 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.515628 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.515640 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.515657 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.515669 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.546463 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.588000 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 
2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.622549 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.622595 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.622604 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.622618 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.622629 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.630802 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.665633 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.705665 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.724705 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.724745 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.724757 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.724772 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.724834 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.748016 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.787644 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.826035 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.827130 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.827154 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.827163 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.827176 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.827186 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.864515 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.906874 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.931772 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.932100 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.932114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.932134 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.932150 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:28Z","lastTransitionTime":"2025-10-02T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:28 crc kubenswrapper[4929]: I1002 11:10:28.962051 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:28Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.034621 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.034667 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.034677 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.034696 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.034705 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.137524 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.137553 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.137563 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.137578 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.137588 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.156085 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.156106 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.156151 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:29 crc kubenswrapper[4929]: E1002 11:10:29.156205 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:29 crc kubenswrapper[4929]: E1002 11:10:29.156288 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:29 crc kubenswrapper[4929]: E1002 11:10:29.156350 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.240067 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.240113 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.240126 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.240143 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.240153 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.343011 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.343049 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.343060 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.343077 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.343089 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.367134 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.368289 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7hr2m" event={"ID":"6013d401-6138-4c35-9a72-00a269b5c765","Type":"ContainerStarted","Data":"27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.370406 4929 generic.go:334] "Generic (PLEG): container finished" podID="840bd011-2ac2-422e-adc5-5de6c717fd54" containerID="18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b" exitCode=0 Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.370463 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerDied","Data":"18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.377732 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.391300 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.409345 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.422155 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.434667 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.445606 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.445672 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.445687 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.445708 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.445722 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.446357 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.461481 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.471546 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.483397 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.495051 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.507270 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.516568 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.527313 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.547076 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.548322 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.548352 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.548362 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.548378 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.548389 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.573510 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.584032 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.627163 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.630770 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.651715 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.651741 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.651750 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.651762 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.651771 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.667572 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z"
Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.710544 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.746716 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.754598 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.754641 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.754658 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.754680 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.754697 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.785473 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.832434 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.856985 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.857027 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.857036 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.857052 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.857062 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.865944 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.906943 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-k
ube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.945455 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.960030 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.960061 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.960070 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.960087 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.960097 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:29Z","lastTransitionTime":"2025-10-02T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:29 crc kubenswrapper[4929]: I1002 11:10:29.988979 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:29Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.026571 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.064934 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.064988 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.064999 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.065015 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.065028 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.066173 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.112746 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.152911 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.167180 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.167278 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.167288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.167302 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.167320 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.185896 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.224559 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.269069 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.270200 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.270250 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.270272 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.270298 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.270322 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.304782 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.352675 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.373901 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.373944 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.373974 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.373992 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.374032 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.381094 4929 generic.go:334] "Generic (PLEG): container finished" podID="840bd011-2ac2-422e-adc5-5de6c717fd54" containerID="ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d" exitCode=0 Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.381308 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerDied","Data":"ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.388666 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23
fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.429914 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.465585 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.476678 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.476720 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.476733 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.476750 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.476760 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.507235 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.549126 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.580062 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.580109 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.580120 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.580139 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.580154 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.586463 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.629952 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.666926 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"mult
us-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.682225 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.682274 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.682285 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.682305 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.682318 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.707519 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.745847 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.784714 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.784764 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.784777 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.784796 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.784811 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.790531 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.842719 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.869288 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-
kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.886847 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.886882 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.886891 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.886905 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.886915 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.910535 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.947108 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.986837 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:30Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.989518 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.989544 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.989553 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.989568 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:30 crc kubenswrapper[4929]: I1002 11:10:30.989577 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:30Z","lastTransitionTime":"2025-10-02T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.023464 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.063860 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.092694 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.092764 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.092783 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.092813 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.092832 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.108116 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.147116 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.155707 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.155784 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:31 crc kubenswrapper[4929]: E1002 11:10:31.155848 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.155987 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:31 crc kubenswrapper[4929]: E1002 11:10:31.156093 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:31 crc kubenswrapper[4929]: E1002 11:10:31.156173 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.185289 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.195726 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.195774 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.195784 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.195800 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.195809 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.225305 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.267309 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.299681 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.299727 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.299747 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.299764 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.299775 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.308836 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.346685 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8
s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.385734 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.387811 4929 generic.go:334] "Generic (PLEG): container finished" podID="840bd011-2ac2-422e-adc5-5de6c717fd54" containerID="9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812" exitCode=0 Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.387869 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerDied","Data":"9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.395074 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.395886 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.395943 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.402358 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.402382 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.402391 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.402404 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.402413 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.415177 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.416001 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.423851 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.465805 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.508204 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.508512 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.508525 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.508544 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.508556 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.512496 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z 
is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.548752 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.590424 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.610859 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.610896 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.610906 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.610921 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.610932 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.630223 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.671547 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.709872 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.713660 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.713705 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.713717 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.713736 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.713747 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.744205 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.786979 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.816472 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.816512 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.816521 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.816538 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.816548 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.825176 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.875116 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb44
6a1cde414f3be553ec9e9cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.909210 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.919843 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.919915 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.919951 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.920012 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.920037 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:31Z","lastTransitionTime":"2025-10-02T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.945112 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:31 crc kubenswrapper[4929]: I1002 11:10:31.989069 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:31Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.022653 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.022765 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.022778 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.022796 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.022809 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.026939 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.066701 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.110241 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.126158 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.126207 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.126221 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.126242 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.126257 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.148908 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.185665 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.229238 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.229282 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.229293 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.229312 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.229327 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.232856 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.273690 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\"
:\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.306678 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.332300 4929 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.332343 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.332374 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.332392 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.332403 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.402806 4929 generic.go:334] "Generic (PLEG): container finished" podID="840bd011-2ac2-422e-adc5-5de6c717fd54" containerID="2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2" exitCode=0 Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.402988 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.403278 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerDied","Data":"2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.416701 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.435493 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.435535 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.435548 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.435569 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.435582 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.439155 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb44
6a1cde414f3be553ec9e9cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.449990 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.464580 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.508587 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.538874 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.538917 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.538928 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.538947 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.538973 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.548897 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.584334 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.626391 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.640664 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.640695 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.640706 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.640722 4929 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.640733 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.669170 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\
",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.707552 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.744171 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.744752 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.744881 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.744992 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.745114 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
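
[annotation] Each failed body above is a strategic-merge patch against Pod.status: the "$setElementOrder/conditions" directive pins the ordering of the conditions list, while only the changed entries travel in "conditions" and "containerStatuses". A reconstruction of the shape of one such patch, with the UID and one condition taken verbatim from the multus entry earlier and everything else truncated for brevity:

    import json

    patch = {
        "metadata": {"uid": "4599e863-12c0-4c39-a873-a46012459555"},
        "status": {
            "$setElementOrder/conditions": [
                {"type": "PodReadyToStartContainers"},
                {"type": "Initialized"},
                {"type": "Ready"},
                {"type": "ContainersReady"},
                {"type": "PodScheduled"},
            ],
            "conditions": [
                {"lastProbeTime": None,
                 "lastTransitionTime": "2025-10-02T11:10:27Z",
                 "status": "True",
                 "type": "Ready"},
            ],
        },
    }
    print(json.dumps(patch, indent=2))

The double-backslash escaping in the journal lines is quoting of this JSON, applied once by klog and once by the journal, not corruption in the patch itself.
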
Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.746231 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.786536 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.827769 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.848687 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.848732 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.848745 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.848762 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.848774 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.865110 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:32Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.951883 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.952009 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.952029 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.952056 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:32 crc kubenswrapper[4929]: I1002 11:10:32.952080 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:32Z","lastTransitionTime":"2025-10-02T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.055574 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.055615 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.055624 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.055638 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.055647 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.156287 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:33 crc kubenswrapper[4929]: E1002 11:10:33.156399 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.156588 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:33 crc kubenswrapper[4929]: E1002 11:10:33.156661 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.156901 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:33 crc kubenswrapper[4929]: E1002 11:10:33.157324 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.158809 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.158853 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.158863 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.158879 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.158888 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.261331 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.261363 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.261373 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.261390 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.261401 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.364076 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.364114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.364123 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.364138 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.364148 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.410581 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" event={"ID":"840bd011-2ac2-422e-adc5-5de6c717fd54","Type":"ContainerStarted","Data":"f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.410689 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.431059 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.452845 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.466684 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.466740 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.466757 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.466818 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.466837 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.467214 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.482979 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.493435 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.509635 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\
":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 
11:10:33.521466 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.531723 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.542175 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.554502 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.566504 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.569233 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.569276 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.569289 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.569307 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.569319 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.577565 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.594909 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.609310 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:33Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.672273 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.672340 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.672363 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.672391 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.672413 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.774377 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.774418 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.774428 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.774444 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.774455 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.876702 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.876737 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.876751 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.876770 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.876780 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.979180 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.979513 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.979619 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.979753 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:33 crc kubenswrapper[4929]: I1002 11:10:33.979866 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:33Z","lastTransitionTime":"2025-10-02T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.082540 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.082784 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.082858 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.082971 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.083092 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.185514 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.185569 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.185585 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.185608 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.185625 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.288596 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.288841 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.288922 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.289009 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.289068 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.396900 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.396941 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.396951 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.396982 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.396994 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.500365 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.500393 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.500402 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.500418 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.500427 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.603143 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.603180 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.603191 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.603207 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.603219 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.705685 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.705761 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.705780 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.705813 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.705832 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.808839 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.808991 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.809006 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.809023 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.809033 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.821290 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.821465 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:10:50.821442498 +0000 UTC m=+51.371808912 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.821546 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.821585 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.821705 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.821754 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:50.821744966 +0000 UTC m=+51.372111330 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.821774 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.821852 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:50.821829979 +0000 UTC m=+51.372196373 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.911815 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.912109 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.912188 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.912274 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.912349 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:34Z","lastTransitionTime":"2025-10-02T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.922838 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.922925 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923236 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923279 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923301 4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923394 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923465 4929 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923550 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923401 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:50.923373595 +0000 UTC m=+51.473739999 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:34 crc kubenswrapper[4929]: E1002 11:10:34.923774 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:50.923690973 +0000 UTC m=+51.474057457 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:34 crc kubenswrapper[4929]: I1002 11:10:34.957421 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.014458 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.014506 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.014523 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.014548 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.014565 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.118151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.118232 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.118251 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.118276 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.118294 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.156074 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.156103 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.156188 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.156281 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.156389 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.156529 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.221432 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.221503 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.221527 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.221556 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.221577 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.324493 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.324549 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.324566 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.324593 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.324611 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.420069 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/0.log" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.425302 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9" exitCode=1 Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.425379 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.426641 4929 scope.go:117] "RemoveContainer" containerID="b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.427617 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.427696 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.427714 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.427746 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.427769 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.451711 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.466924 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.489878 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d4
69367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:34Z\\\",\\\"message\\\":\\\"608334 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 11:10:34.608347 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:34.608397 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:34.608406 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:34.608426 6232 factory.go:656] Stopping watch factory\\\\nI1002 11:10:34.608445 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:34.608296 6232 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608470 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 11:10:34.608479 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 11:10:34.608486 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:34.608488 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:34.608463 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 11:10:34.608597 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608640 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.509726 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.526878 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.531317 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.531376 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.531385 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.531402 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.531413 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.543228 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.561552 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.576356 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.594305 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.609788 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.634018 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.634067 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.634085 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.634111 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.634129 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.637043 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.652245 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.674927 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.688156 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/r
ootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.737922 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.738047 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.738069 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.738098 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.738132 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.840635 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.840684 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.840693 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.840706 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.840714 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.845419 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.845469 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.845486 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.845506 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.845522 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.857430 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.861897 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.861936 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.861948 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.861984 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.861997 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.875048 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.879417 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.879474 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.879492 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.879517 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.879535 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.894294 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.897883 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.897935 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.897953 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.898006 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.898026 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.913671 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.919751 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.919809 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.919826 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.919847 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.919864 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.936174 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:35Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:35 crc kubenswrapper[4929]: E1002 11:10:35.936394 4929 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.943336 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.943416 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.943430 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.943449 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:35 crc kubenswrapper[4929]: I1002 11:10:35.943461 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:35Z","lastTransitionTime":"2025-10-02T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.045235 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.045631 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.045641 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.045657 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.045666 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.147931 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.147996 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.148011 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.148039 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.148053 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.250869 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.250896 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.250906 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.250922 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.250932 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.353396 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.353434 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.353445 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.353462 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.353473 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.430926 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/0.log" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.433798 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.434237 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.450027 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.456114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.456162 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.456176 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.456200 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.456221 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.464135 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.477853 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.500426 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:34Z\\\",\\\"message\\\":\\\"608334 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 11:10:34.608347 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:34.608397 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:34.608406 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:34.608426 6232 factory.go:656] Stopping watch factory\\\\nI1002 11:10:34.608445 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:34.608296 6232 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608470 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 11:10:34.608479 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 11:10:34.608486 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:34.608488 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:34.608463 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 11:10:34.608597 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608640 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.512464 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.527295 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.547775 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.558635 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.558676 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.558687 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.558708 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.558721 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.567645 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.591922 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.605888 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.628240 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.639929 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.651476 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.660428 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.660491 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.660505 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.660525 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.660539 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.666303 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:36Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.762397 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.762573 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.762586 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.762606 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.762616 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.865165 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.865203 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.865212 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.865228 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.865239 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.967571 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.967644 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.967660 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.967684 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:36 crc kubenswrapper[4929]: I1002 11:10:36.967700 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:36Z","lastTransitionTime":"2025-10-02T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.261869 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.261904 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:37 crc kubenswrapper[4929]: E1002 11:10:37.262001 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.262017 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:37 crc kubenswrapper[4929]: E1002 11:10:37.262134 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:37 crc kubenswrapper[4929]: E1002 11:10:37.262216 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.262310 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.262402 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.262416 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.262437 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.262449 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.364517 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.364561 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.364570 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.364589 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.364600 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.439276 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/1.log" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.439935 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/0.log" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.442477 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5" exitCode=1 Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.442535 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.442599 4929 scope.go:117] "RemoveContainer" containerID="b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.443652 4929 scope.go:117] "RemoveContainer" containerID="7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5" Oct 02 11:10:37 crc kubenswrapper[4929]: E1002 11:10:37.443843 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.459225 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.467039 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.467073 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.467082 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.467097 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.467106 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.471242 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.494182 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.512465 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:34Z\\\",\\\"message\\\":\\\"608334 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 11:10:34.608347 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:34.608397 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:34.608406 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:34.608426 6232 factory.go:656] Stopping watch factory\\\\nI1002 11:10:34.608445 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:34.608296 6232 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608470 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 11:10:34.608479 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 11:10:34.608486 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:34.608488 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:34.608463 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 11:10:34.608597 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608640 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 
11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.526160 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.546337 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.563910 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.569514 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.569572 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.569590 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.569617 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.569636 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.577376 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.593581 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.605118 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.619696 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.637266 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.660086 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.672657 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.672715 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.672740 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.672767 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.672788 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.677012 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.777177 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.777243 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.777264 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.777293 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.777318 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.880798 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.880864 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.880882 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.880908 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.880927 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.891602 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc"] Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.892190 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.894704 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.895172 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.916045 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.941727 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.959424 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/r
ootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.968620 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sh9x\" (UniqueName: \"kubernetes.io/projected/0b304fba-3157-4fb6-a634-ed39fd56821b-kube-api-access-8sh9x\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.968723 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b304fba-3157-4fb6-a634-ed39fd56821b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.968765 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0b304fba-3157-4fb6-a634-ed39fd56821b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.968833 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0b304fba-3157-4fb6-a634-ed39fd56821b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.981118 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:37Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.984098 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.984175 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.984203 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.984235 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:37 crc kubenswrapper[4929]: I1002 11:10:37.984260 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:37Z","lastTransitionTime":"2025-10-02T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.004570 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.016643 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.029682 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.051379 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b233e77e412d54ae15df348f20362c357589bb446a1cde414f3be553ec9e9cd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:34Z\\\",\\\"message\\\":\\\"608334 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 11:10:34.608347 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:34.608397 6232 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:34.608406 6232 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:34.608426 6232 factory.go:656] Stopping watch factory\\\\nI1002 11:10:34.608445 6232 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:34.608296 6232 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608470 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 11:10:34.608479 6232 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 11:10:34.608486 6232 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:34.608488 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:34.608463 6232 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 11:10:34.608597 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 11:10:34.608640 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 
11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.067161 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.069725 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sh9x\" (UniqueName: 
\"kubernetes.io/projected/0b304fba-3157-4fb6-a634-ed39fd56821b-kube-api-access-8sh9x\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.069810 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b304fba-3157-4fb6-a634-ed39fd56821b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.069873 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0b304fba-3157-4fb6-a634-ed39fd56821b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.070011 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0b304fba-3157-4fb6-a634-ed39fd56821b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.070906 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0b304fba-3157-4fb6-a634-ed39fd56821b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.071098 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0b304fba-3157-4fb6-a634-ed39fd56821b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.077342 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0b304fba-3157-4fb6-a634-ed39fd56821b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.085999 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.086701 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.086742 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.086755 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.086772 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.086784 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.095870 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sh9x\" (UniqueName: \"kubernetes.io/projected/0b304fba-3157-4fb6-a634-ed39fd56821b-kube-api-access-8sh9x\") pod \"ovnkube-control-plane-749d76644c-lh6dc\" (UID: \"0b304fba-3157-4fb6-a634-ed39fd56821b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.101641 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.116462 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.130230 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.140819 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.153353 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.189714 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.189797 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.189812 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.189833 4929 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.189845 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.209124 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.292365 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.292412 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.292425 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.292448 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.292468 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.394418 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.394480 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.394496 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.394518 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.394534 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.447505 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" event={"ID":"0b304fba-3157-4fb6-a634-ed39fd56821b","Type":"ContainerStarted","Data":"1e4489be4285490224ae31f194929df6da546079beeb2e664738925338d65afa"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.448845 4929 scope.go:117] "RemoveContainer" containerID="7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5" Oct 02 11:10:38 crc kubenswrapper[4929]: E1002 11:10:38.449238 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.464294 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.484225 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.497583 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.497628 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.497642 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.497664 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.497677 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.500462 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.516938 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.527578 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.538135 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.549972 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.560980 4929 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.576267 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.591132 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.599882 4929 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.600472 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.600702 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.600955 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.601199 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.615845 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.634898 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.650912 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.672404 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.706259 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.706335 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.706351 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.706377 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.706393 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.706571 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4a
c64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:38Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.809230 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.809277 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.809291 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.809311 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.809328 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.912691 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.912765 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.912789 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.912820 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:38 crc kubenswrapper[4929]: I1002 11:10:38.912843 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:38Z","lastTransitionTime":"2025-10-02T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.016631 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.016683 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.016697 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.016758 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.016773 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.058639 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-59lbt"] Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.059340 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.059440 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.082588 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.099589 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.120849 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.120929 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.120985 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.121019 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.121043 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.121637 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.147163 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.156395 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.156488 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.156583 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.156694 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.157993 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.158197 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.163672 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.179900 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.181522 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tggxc\" (UniqueName: 
\"kubernetes.io/projected/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-kube-api-access-tggxc\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.181618 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.201557 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operat
or@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.220340 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.224434 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.224505 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.224531 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.224566 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.224591 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.246171 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.261414 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.278104 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.282625 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tggxc\" (UniqueName: \"kubernetes.io/projected/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-kube-api-access-tggxc\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.282726 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.282866 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.282931 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:39.782908635 +0000 UTC m=+40.333275029 (durationBeforeRetry 500ms). 
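The durationBeforeRetry values in this window (500ms on the failure above, 1s on the retry recorded further down) are consistent with a per-operation doubling backoff in the kubelet's volume manager. The sketch below only illustrates that doubling pattern and is not the kubelet's actual implementation: the initial 500ms and the doubling match the observed log values, while the cap and the function name are assumptions for illustration.

package main

import (
	"fmt"
	"time"
)

// nextRetryDelay doubles the previous delay up to maxDelay, matching the
// progression observed in this log (500ms -> 1s). The 2m cap is an
// assumption for this sketch, not a value taken from the log.
func nextRetryDelay(prev, maxDelay time.Duration) time.Duration {
	if prev == 0 {
		return 500 * time.Millisecond // first failure, as logged above
	}
	next := prev * 2
	if next > maxDelay {
		return maxDelay
	}
	return next
}

func main() {
	var d time.Duration
	for i := 0; i < 5; i++ {
		d = nextRetryDelay(d, 2*time.Minute)
		fmt.Printf("retry %d after %v\n", i+1, d) // 500ms, 1s, 2s, 4s, 8s
	}
}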
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.291110 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11
:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.303668 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.313040 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tggxc\" (UniqueName: \"kubernetes.io/projected/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-kube-api-access-tggxc\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.320392 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.328018 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.328061 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.328073 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.328091 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.328102 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.340747 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.360194 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:39Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.430171 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.430220 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.430233 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.430249 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.430258 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.532217 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.532283 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.532306 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.532338 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.532360 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.635584 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.635636 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.635650 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.635669 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.635682 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.737979 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.738007 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.738016 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.738031 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.738041 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.787197 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.787317 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:39 crc kubenswrapper[4929]: E1002 11:10:39.787390 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:40.787373437 +0000 UTC m=+41.337739801 (durationBeforeRetry 1s). 
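The NodeNotReady loop repeating above comes from the runtime's network-readiness check: the node stays NotReady until a CNI configuration file appears in /etc/kubernetes/cni/net.d/, which the multus/ovn-kubernetes pods in this log are responsible for writing once they start. As a rough sketch only, a check of that kind could look like the following; the directory path is taken from the log message, while the function name and accepted file extensions are assumptions rather than cri-o's actual logic, which also validates file contents.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether any CNI config file exists in dir.
// The .conf/.conflist/.json extensions are an assumption for this sketch.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		// Mirrors the condition reported above:
		// NetworkReady=false, reason NetworkPluginNotReady.
		fmt.Println("container runtime network not ready: no CNI configuration file")
		return
	}
	fmt.Println("NetworkReady=true")
}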
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.840289 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.840347 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.840371 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.840402 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.840424 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.945000 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.945091 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.945117 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.945156 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:39 crc kubenswrapper[4929]: I1002 11:10:39.945183 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:39Z","lastTransitionTime":"2025-10-02T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.048945 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.049033 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.049051 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.049083 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.049111 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.153295 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.153388 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.153410 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.153445 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.153471 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.184990 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.200283 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.218599 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.232063 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.257127 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.257195 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.257219 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.257254 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.257281 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.261502 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.293043 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.312234 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.325410 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.342182 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.353713 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.359640 4929 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.359696 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.359714 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.359755 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.359773 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.368982 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.384002 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.400330 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.424416 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.447142 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.456212 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/1.log" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.462526 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" event={"ID":"0b304fba-3157-4fb6-a634-ed39fd56821b","Type":"ContainerStarted","Data":"beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.462610 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:40Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.463399 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.463438 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.463451 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.463467 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.463478 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.568029 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.568466 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.568478 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.568497 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.568507 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.673139 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.673201 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.673222 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.673248 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.673267 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.776288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.777087 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.777187 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.777281 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.777363 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.797083 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:40 crc kubenswrapper[4929]: E1002 11:10:40.797350 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:40 crc kubenswrapper[4929]: E1002 11:10:40.797646 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:42.797623606 +0000 UTC m=+43.347989970 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.883192 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.883503 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.883601 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.883711 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.883816 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.987045 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.987128 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.987165 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.987189 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:40 crc kubenswrapper[4929]: I1002 11:10:40.987204 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:40Z","lastTransitionTime":"2025-10-02T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.091386 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.091779 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.091931 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.092109 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.092249 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.156179 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:41 crc kubenswrapper[4929]: E1002 11:10:41.156324 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.157097 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:41 crc kubenswrapper[4929]: E1002 11:10:41.157276 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.157790 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:41 crc kubenswrapper[4929]: E1002 11:10:41.157892 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.157987 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:41 crc kubenswrapper[4929]: E1002 11:10:41.158210 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.195349 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.195399 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.195410 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.195430 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.195447 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.298391 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.298444 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.298463 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.298489 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.298506 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.400771 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.400805 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.400818 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.400836 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.400847 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.473007 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" event={"ID":"0b304fba-3157-4fb6-a634-ed39fd56821b","Type":"ContainerStarted","Data":"29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.486250 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.503284 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.503348 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.503367 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.503395 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.503413 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.505193 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.518228 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.545608 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.557044 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.571607 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.599456 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.607619 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.607659 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.607669 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.607685 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.607697 4929 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.624505 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82
799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.646132 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.665070 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.684124 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.704889 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.711065 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.711124 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.711146 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.711177 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.711199 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.727434 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.747724 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.775309 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.788822 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:41Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.813247 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.813288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.813297 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.813310 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.813320 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.915544 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.915586 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.915595 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.915611 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:41 crc kubenswrapper[4929]: I1002 11:10:41.915622 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:41Z","lastTransitionTime":"2025-10-02T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.018712 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.018757 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.018767 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.018782 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.018791 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.125763 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.125845 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.125873 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.125909 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.125940 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.229683 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.229766 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.229788 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.229819 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.229843 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.333101 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.333158 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.333175 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.333197 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.333209 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.439332 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.439368 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.439376 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.439388 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.439398 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.542459 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.542512 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.542532 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.542557 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.542574 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.645704 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.645766 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.645786 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.645812 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.645829 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.748538 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.748591 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.748608 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.748633 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.748650 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.817405 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:42 crc kubenswrapper[4929]: E1002 11:10:42.817629 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:42 crc kubenswrapper[4929]: E1002 11:10:42.817711 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:46.817687712 +0000 UTC m=+47.368054116 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.851060 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.851131 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.851151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.851180 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.851200 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.953531 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.953577 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.953589 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.953609 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:42 crc kubenswrapper[4929]: I1002 11:10:42.953622 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:42Z","lastTransitionTime":"2025-10-02T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.056128 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.056505 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.056720 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.057032 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.057274 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.156208 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.156232 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt"
Oct 02 11:10:43 crc kubenswrapper[4929]: E1002 11:10:43.156693 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.156351 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:10:43 crc kubenswrapper[4929]: E1002 11:10:43.157446 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:10:43 crc kubenswrapper[4929]: E1002 11:10:43.156739 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.156245 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:10:43 crc kubenswrapper[4929]: E1002 11:10:43.158176 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.161578 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.161683 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.161774 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.161865 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.161951 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.264317 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.264368 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.264382 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.264399 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.264412 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.366870 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.367179 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.367426 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.367573 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.367669 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.471171 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.471203 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.471211 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.471224 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.471234 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.573113 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.573179 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.573200 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.573225 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.573248 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.675999 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.676064 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.676087 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.676119 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.676141 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.779595 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.779682 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.779706 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.779739 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.779761 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.883140 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.883210 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.883232 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.883264 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.883285 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.985434 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.985484 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.985500 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.985520 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:43 crc kubenswrapper[4929]: I1002 11:10:43.985539 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:43Z","lastTransitionTime":"2025-10-02T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.088115 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.088159 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.088170 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.088186 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.088197 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.190209 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.190252 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.190263 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.190283 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.190294 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.292927 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.292999 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.293012 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.293032 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.293043 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.396279 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.396325 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.396365 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.396385 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.396398 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.499698 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.499768 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.499788 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.499817 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.499837 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.603577 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.603649 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.603673 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.603703 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.603726 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.707395 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.707477 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.707504 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.707540 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.707564 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.810413 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.810488 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.810508 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.810533 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.810550 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.913196 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.913245 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.913277 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.913297 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:44 crc kubenswrapper[4929]: I1002 11:10:44.913309 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:44Z","lastTransitionTime":"2025-10-02T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.015648 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.015724 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.015747 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.015772 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.015788 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.119371 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.119419 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.119428 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.119442 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.119453 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.156554 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.156619 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.156626 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:45 crc kubenswrapper[4929]: E1002 11:10:45.156723 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.156551 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:45 crc kubenswrapper[4929]: E1002 11:10:45.156887 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:45 crc kubenswrapper[4929]: E1002 11:10:45.157078 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:45 crc kubenswrapper[4929]: E1002 11:10:45.157272 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.223064 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.223237 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.223268 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.223446 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.223543 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.327219 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.327276 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.327292 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.327360 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.327379 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.429851 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.429907 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.429924 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.429948 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.430040 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.532362 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.532391 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.532399 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.532413 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.532422 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.635434 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.635494 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.635512 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.635536 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.635553 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.739759 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.739820 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.739837 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.739868 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.739889 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.843363 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.843423 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.843444 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.843470 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.843488 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.946828 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.946867 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.946880 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.946897 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:45 crc kubenswrapper[4929]: I1002 11:10:45.946908 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:45Z","lastTransitionTime":"2025-10-02T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.049628 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.049667 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.049677 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.049691 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.049701 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.152983 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.153023 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.153033 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.153051 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.153063 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.255678 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.255736 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.255753 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.255778 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.255796 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.294845 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.294900 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.295357 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.295407 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.295425 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.321787 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:46Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.327406 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.327453 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.327471 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.327494 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.327512 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.349221 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:46Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.354130 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.354176 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.354194 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.354217 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.354234 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.377154 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the first failed patch attempt above; elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:46Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.382307 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.382348 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.382382 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.382399 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.382410 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.402793 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the first failed patch attempt above; elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:46Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.407859 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.407913 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.407932 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.407992 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.408012 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.428954 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[… image list identical to the first failed patch attempt above; elided …],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:46Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.429269 4929 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.431546 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.431602 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.431617 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.431637 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.431651 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.534326 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.534406 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.534432 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.534467 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.534491 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.637677 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.637729 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.637746 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.637770 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.637787 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.740769 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.740818 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.740837 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.740863 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.740896 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.844323 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.844391 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.844409 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.844438 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.844459 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.865210 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.865507 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:46 crc kubenswrapper[4929]: E1002 11:10:46.865663 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:10:54.865595493 +0000 UTC m=+55.415961947 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.947334 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.947381 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.947399 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.947418 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:46 crc kubenswrapper[4929]: I1002 11:10:46.947430 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:46Z","lastTransitionTime":"2025-10-02T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.050274 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.050328 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.050339 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.050357 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.050369 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.152614 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.152684 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.152704 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.152730 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.152749 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.156116 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.156163 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.156190 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.156139 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:47 crc kubenswrapper[4929]: E1002 11:10:47.156255 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:47 crc kubenswrapper[4929]: E1002 11:10:47.156343 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:47 crc kubenswrapper[4929]: E1002 11:10:47.156478 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:47 crc kubenswrapper[4929]: E1002 11:10:47.156639 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.255927 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.256043 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.256062 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.256092 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.256110 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.358435 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.358473 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.358481 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.358498 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.358507 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.461884 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.462046 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.462077 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.462114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.462143 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.565818 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.565892 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.565929 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.565996 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.566023 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.670105 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.670214 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.670249 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.670288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.670318 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.774585 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.774657 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.774675 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.774705 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.774725 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.878143 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.878232 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.878267 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.878300 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.878321 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.982201 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.982267 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.982286 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.982313 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:47 crc kubenswrapper[4929]: I1002 11:10:47.982335 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:47Z","lastTransitionTime":"2025-10-02T11:10:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.093418 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.093482 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.093503 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.093532 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.093552 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.197167 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.197241 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.197259 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.197285 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.197330 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.301251 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.301832 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.302149 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.302367 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.302573 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.406896 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.407017 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.407043 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.407071 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.407091 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.511030 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.511106 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.511130 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.511159 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.511184 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.614987 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.615038 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.615050 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.615068 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.615077 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.718382 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.718433 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.718444 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.718471 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.718483 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.821699 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.821747 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.821757 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.821776 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.821786 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.925569 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.925732 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.925758 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.925783 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:48 crc kubenswrapper[4929]: I1002 11:10:48.925802 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:48Z","lastTransitionTime":"2025-10-02T11:10:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.029565 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.029653 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.029675 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.029701 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.029718 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.131899 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.131935 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.131943 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.131979 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.131988 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.156521 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.156580 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.156640 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.156652 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:49 crc kubenswrapper[4929]: E1002 11:10:49.156741 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:49 crc kubenswrapper[4929]: E1002 11:10:49.156900 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:49 crc kubenswrapper[4929]: E1002 11:10:49.157029 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:49 crc kubenswrapper[4929]: E1002 11:10:49.157138 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.234325 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.234365 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.234375 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.234387 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.234396 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.337164 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.337203 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.337214 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.337231 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.337242 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.439580 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.439614 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.439624 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.439637 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.439645 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.542189 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.542225 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.542235 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.542252 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.542262 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.645422 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.645486 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.645508 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.645535 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.645554 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.749708 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.749781 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.749802 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.749833 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.749851 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.853511 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.853588 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.853599 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.853618 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.853630 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
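A few entries further down, status_manager.go:875 begins logging failed status patches whose JSON payload is embedded as a twice-escaped quoted string, which is nearly unreadable raw. A hedged helper to recover and pretty-print that payload; the two-pass unescape matches the quoting seen in this particular capture and is not guaranteed for other journal configurations:

```python
import json
import re

# Matches the embedded patch in one status_manager.go:875 entry, i.e. the
# text between 'failed to patch status \"' and '\" for pod'.
PATCH = re.compile(r'failed to patch status \\"(\{.*\})\\" for pod')

def extract_patch(journal_line: str) -> dict:
    """Recover the JSON status patch embedded in a single journal entry."""
    m = PATCH.search(journal_line)
    if not m:
        raise ValueError("no embedded status patch found")
    s = m.group(1)
    for _ in range(2):  # two quoting levels, as captured here; ASCII payload assumed
        s = s.encode().decode("unicode_escape")
    return json.loads(s)

# Usage: print(json.dumps(extract_patch(line), indent=2)) to see the
# metadata.uid plus the status.conditions the kubelet was trying to write.
```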
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.956282 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.956365 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.956390 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.956427 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:49 crc kubenswrapper[4929]: I1002 11:10:49.956455 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:49Z","lastTransitionTime":"2025-10-02T11:10:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.061388 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.061468 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.061492 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.061530 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.061555 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.167873 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.168005 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.168029 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.168059 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.168097 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.183804 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.203184 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.224031 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.239192 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.257934 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.275048 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.275112 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.275148 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.275168 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.275179 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.275400 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.293454 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 
11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.309169 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.336600 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.358685 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.377557 4929 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.377600 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.377613 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.377630 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.377642 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.380551 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.399713 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.418021 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.431648 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.445786 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.480180 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.480232 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.480246 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.480270 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.480286 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.480245 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4a
c64ee521d2475688a02f30e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:50Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.583663 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.583750 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.583775 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.583815 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.583841 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.687832 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.687875 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.687886 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.687901 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.687913 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.791731 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.791797 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.791820 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.791850 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.791870 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.896212 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.897156 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.897188 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.897220 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.897242 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:50Z","lastTransitionTime":"2025-10-02T11:10:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.914385 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:10:50 crc kubenswrapper[4929]: E1002 11:10:50.916365 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:11:22.916322472 +0000 UTC m=+83.466688836 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.916491 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:50 crc kubenswrapper[4929]: I1002 11:10:50.916530 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:50 crc kubenswrapper[4929]: E1002 11:10:50.916633 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:50 crc kubenswrapper[4929]: E1002 11:10:50.916676 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:50 crc kubenswrapper[4929]: E1002 11:10:50.916694 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:22.916685062 +0000 UTC m=+83.467051426 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:10:50 crc kubenswrapper[4929]: E1002 11:10:50.916718 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:22.916707163 +0000 UTC m=+83.467073527 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.000517 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.000586 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.000606 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.000636 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.000657 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.017930 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.018016 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018157 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018178 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018190 4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018245 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:23.018229318 +0000 UTC m=+83.568595682 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018256 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018307 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018330 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.018473 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:23.018442144 +0000 UTC m=+83.568808538 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.109435 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.109478 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.109491 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.109507 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.109517 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.156441 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.156548 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.156648 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.156867 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.156877 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.156920 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.157017 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:51 crc kubenswrapper[4929]: E1002 11:10:51.157074 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.212932 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.213019 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.213037 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.213063 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.213082 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.317077 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.317199 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.317230 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.317256 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.317285 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.420206 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.420291 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.420311 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.420345 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.420378 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.523337 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.523404 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.523421 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.523445 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.523462 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.626348 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.626404 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.626420 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.626444 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.626489 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.729742 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.729793 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.729803 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.729818 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.729829 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.832629 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.832702 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.832721 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.832749 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.832765 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.935872 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.936563 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.936654 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.936745 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:51 crc kubenswrapper[4929]: I1002 11:10:51.936834 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:51Z","lastTransitionTime":"2025-10-02T11:10:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.039577 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.039632 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.039644 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.039664 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.039682 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.143837 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.143909 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.143928 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.143986 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.144014 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.157870 4929 scope.go:117] "RemoveContainer" containerID="7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.247537 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.248120 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.248135 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.248166 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.248185 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.350138 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.350172 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.350184 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.350200 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.350214 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.453072 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.453133 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.453152 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.453175 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.453220 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.516271 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/1.log" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.519863 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.521518 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.535607 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.552040 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.556036 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.556064 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.556073 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.556088 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.556101 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.576656 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.602025 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3
371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 
11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.619404 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.641656 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.658576 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.658613 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.658644 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.658660 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.658671 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.659822 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.674410 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.684663 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.699357 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.711529 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.727100 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.748685 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.762212 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.762255 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc 
kubenswrapper[4929]: I1002 11:10:52.762266 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.762282 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.762295 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.769127 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.783246 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.803679 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:52Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.865101 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.865144 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.865158 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.865176 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.865189 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.968271 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.968345 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.968364 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.968393 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:52 crc kubenswrapper[4929]: I1002 11:10:52.968415 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:52Z","lastTransitionTime":"2025-10-02T11:10:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.071329 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.071392 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.071413 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.071439 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.071456 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.155808 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.155842 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.155935 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:53 crc kubenswrapper[4929]: E1002 11:10:53.156132 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.156168 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:53 crc kubenswrapper[4929]: E1002 11:10:53.156256 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:53 crc kubenswrapper[4929]: E1002 11:10:53.156322 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:53 crc kubenswrapper[4929]: E1002 11:10:53.156353 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.175256 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.175321 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.175338 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.175367 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.175389 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.278160 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.278203 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.278214 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.278228 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.278240 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.381802 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.381846 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.381859 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.381876 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.381888 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.486190 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.486266 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.486281 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.486301 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.486317 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.527075 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/2.log" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.528088 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/1.log" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.532378 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08" exitCode=1 Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.532419 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.532456 4929 scope.go:117] "RemoveContainer" containerID="7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.533792 4929 scope.go:117] "RemoveContainer" containerID="b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08" Oct 02 11:10:53 crc kubenswrapper[4929]: E1002 11:10:53.534161 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.565749 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.583991 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.588902 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.589130 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.589323 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.589500 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.589669 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.606276 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.640805 
4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154
edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8a8f2f3d0791b7dda9a014d1a17743a2065b4ac64ee521d2475688a02f30e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"message\\\":\\\"g/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 11:10:37.050991 6397 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 11:10:37.051027 6397 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 11:10:37.051067 6397 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 11:10:37.051088 6397 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 11:10:37.051112 6397 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 11:10:37.051120 6397 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1002 11:10:37.051132 6397 handler.go:208] Removed *v1.Node event handler 7\\\\nI1002 11:10:37.051162 6397 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 11:10:37.051142 6397 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 11:10:37.051170 6397 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 11:10:37.051144 6397 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1002 11:10:37.051230 6397 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 11:10:37.051348 6397 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 11:10:37.051391 6397 factory.go:656] Stopping watch factory\\\\nI1002 11:10:37.051407 6397 ovnkube.go:599] Stopped ovnkube\\\\nI1002 11:10:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod 
openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.657773 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.673339 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.687867 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.693896 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.693954 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.694021 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.694066 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.694084 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.705083 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.717619 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.730905 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.744684 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 
11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.759902 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.777690 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.797145 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.797200 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.797218 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.797242 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.797259 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.798145 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.813106 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 
11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.826324 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:53Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.900182 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.900258 4929 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.900279 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.900311 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:53 crc kubenswrapper[4929]: I1002 11:10:53.900334 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:53Z","lastTransitionTime":"2025-10-02T11:10:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.002500 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.002555 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.002569 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.002585 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.002596 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.105608 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.105817 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.105904 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.105986 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.106058 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.209184 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.209232 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.209241 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.209256 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.209267 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.315107 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.315159 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.315173 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.315192 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.315205 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.418792 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.418838 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.418855 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.418877 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.418894 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.523477 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.523804 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.524065 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.524288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.524487 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.540569 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/2.log" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.544015 4929 scope.go:117] "RemoveContainer" containerID="b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08" Oct 02 11:10:54 crc kubenswrapper[4929]: E1002 11:10:54.544162 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.567117 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.586521 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.605552 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.617285 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.628157 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.628222 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.628243 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.628275 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.628294 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.630507 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.643155 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.659295 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 
11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.676235 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.697565 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.715072 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.725590 4929 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.730994 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.731022 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 
11:10:54.731030 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.731043 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.731053 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.741485 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{
\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.754325 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.771176 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.788883 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.830058 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:54Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.833611 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.833681 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.833696 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.833718 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.833733 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.935785 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.935817 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.935826 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.935840 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.935852 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:54Z","lastTransitionTime":"2025-10-02T11:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:54 crc kubenswrapper[4929]: I1002 11:10:54.963462 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:54 crc kubenswrapper[4929]: E1002 11:10:54.963606 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:54 crc kubenswrapper[4929]: E1002 11:10:54.963663 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:10.963639769 +0000 UTC m=+71.514006133 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.038542 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.038610 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.038619 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.038633 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.038658 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.141860 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.141928 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.141946 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.141979 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.141992 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.156413 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.156481 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.156525 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:55 crc kubenswrapper[4929]: E1002 11:10:55.156567 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.156431 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:55 crc kubenswrapper[4929]: E1002 11:10:55.156727 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:55 crc kubenswrapper[4929]: E1002 11:10:55.156816 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:55 crc kubenswrapper[4929]: E1002 11:10:55.156900 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.244847 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.244887 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.244898 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.244913 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.244923 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.325392 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.333289 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.340758 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.347568 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.347607 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.347616 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.347631 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.347641 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
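
[editor's note] The failing webhook endpoint is local (https://127.0.0.1:9743), so the certificate the kubelet keeps rejecting can be inspected directly from the node. A minimal diagnostic sketch, assuming it runs on the node itself; InsecureSkipVerify is deliberate here so the handshake survives long enough to read the expired certificate instead of failing the way the kubelet does:

// Inspect the webhook serving certificate at the address from this log.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // read the expired cert rather than reject it
	})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:   ", cert.Subject)
	fmt.Println("not before:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("not after: ", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate is expired, matching the x509 error above")
	}
}
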
Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.355330 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.375749 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.398643 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.413150 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.426168 4929 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cc
a4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.439613 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.450024 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.450062 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.450072 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.450084 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.450094 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.452523 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.465272 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.477288 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.488416 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 
11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.500498 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.514913 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.526370 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.535499 4929 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.548955 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:55Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.552366 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.552401 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.552416 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.552434 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.552448 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.655296 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.655348 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.655361 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.655378 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.655390 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.758090 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.758133 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.758145 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.758159 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.758168 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.860925 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.860991 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.861003 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.861020 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.861031 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.963889 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.963981 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.963994 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.964011 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:55 crc kubenswrapper[4929]: I1002 11:10:55.964021 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:55Z","lastTransitionTime":"2025-10-02T11:10:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.066229 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.066279 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.066288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.066303 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.066311 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.168200 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.168238 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.168247 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.168260 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.168269 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.271223 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.271289 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.271308 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.271330 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.271347 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.374345 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.374408 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.374426 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.374449 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.374469 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.477914 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.477965 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.477978 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.477993 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.478003 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.580731 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.580812 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.580830 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.580856 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.580883 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.683206 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.683311 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.683329 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.683354 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.683371 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.774291 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.774340 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.774349 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.774365 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.774375 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: E1002 11:10:56.789417 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:56Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.794503 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.794575 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.794598 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.794621 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.794641 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: E1002 11:10:56.810644 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:56Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.815524 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.815578 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.815740 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.815759 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.815773 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: E1002 11:10:56.833836 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:56Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.837651 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.837676 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.837685 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.837699 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.837708 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: E1002 11:10:56.852529 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:56Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.857151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.857188 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.857197 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.857213 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.857222 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: E1002 11:10:56.872200 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:10:56Z is after 2025-08-24T17:21:41Z" Oct 02 11:10:56 crc kubenswrapper[4929]: E1002 11:10:56.872352 4929 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.873853 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
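Every status-patch failure above shares one root cause: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a TLS certificate that expired on 2025-08-24T17:21:41Z, weeks before the clock time in the log. A minimal sketch for confirming the validity window from the node itself, assuming Python 3 and the third-party cryptography package are available on the host (the address comes from the error text; nothing in the log shows this tooling):

# check_webhook_cert.py - print the webhook certificate's validity window.
import ssl
from cryptography import x509

ADDR = ("127.0.0.1", 9743)  # webhook endpoint named in the kubelet error

# get_server_certificate skips chain verification, so it still works
# when the presented certificate is expired or self-signed.
pem = ssl.get_server_certificate(ADDR)
cert = x509.load_pem_x509_certificate(pem.encode())

print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)  # log reports 2025-08-24T17:21:41Z

An expired internal certificate like this is typical of a cluster VM that sat stopped past its rotation window; the kubelet entries here show only the symptom, not the remediation.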
event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.873894 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.873907 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.873925 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.873937 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.977389 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.977452 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.977463 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.977477 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:56 crc kubenswrapper[4929]: I1002 11:10:56.977486 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:56Z","lastTransitionTime":"2025-10-02T11:10:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.079534 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.079565 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.079589 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.079604 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.079615 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.155629 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.155676 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.155766 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:57 crc kubenswrapper[4929]: E1002 11:10:57.155882 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.155924 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:57 crc kubenswrapper[4929]: E1002 11:10:57.156080 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:57 crc kubenswrapper[4929]: E1002 11:10:57.156310 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:57 crc kubenswrapper[4929]: E1002 11:10:57.156372 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.182246 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.182301 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.182319 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.182342 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.182360 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.285222 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.285251 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.285260 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.285277 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.285288 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.389613 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.389700 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.389719 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.389751 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.389774 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.492156 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.492219 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.492230 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.492261 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.492272 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.594925 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.594996 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.595007 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.595021 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.595032 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.697992 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.698035 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.698043 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.698056 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.698065 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.800542 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.800589 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.800598 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.800615 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.800631 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.903847 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.903904 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.903925 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.903993 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:57 crc kubenswrapper[4929]: I1002 11:10:57.904029 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:57Z","lastTransitionTime":"2025-10-02T11:10:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.007023 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.007081 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.007098 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.007124 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.007143 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.110195 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.110270 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.110295 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.110330 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.110355 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.213338 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.213394 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.213411 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.213435 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.213455 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.317523 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.317829 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.317923 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.318053 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.318199 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.421073 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.421132 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.421151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.421189 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.421206 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.524044 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.524123 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.524148 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.524178 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.524199 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.626462 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.626500 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.626511 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.626527 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.626538 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.729572 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.729607 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.729615 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.729628 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.729637 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.833068 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.833144 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.833286 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.833326 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.833349 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.937065 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.937118 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.937127 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.937142 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:58 crc kubenswrapper[4929]: I1002 11:10:58.937152 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:58Z","lastTransitionTime":"2025-10-02T11:10:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.040416 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.040489 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.040507 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.040529 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.040547 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.143507 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.143567 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.143584 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.143648 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.143670 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.156244 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.156308 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.156340 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:10:59 crc kubenswrapper[4929]: E1002 11:10:59.156375 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.156442 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:10:59 crc kubenswrapper[4929]: E1002 11:10:59.156627 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:10:59 crc kubenswrapper[4929]: E1002 11:10:59.156741 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:10:59 crc kubenswrapper[4929]: E1002 11:10:59.156884 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.247240 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.247514 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.247585 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.247691 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.247760 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.350920 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.350988 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.350997 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.351010 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.351018 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.454110 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.454525 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.454685 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.454818 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.454937 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.558483 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.558535 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.558553 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.558578 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.558595 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.660806 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.660844 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.660855 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.660872 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.660882 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.764008 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.764041 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.764049 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.764061 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.764069 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.867339 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.867374 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.867384 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.867397 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.867408 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.970100 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.970160 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.970183 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.970211 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:10:59 crc kubenswrapper[4929]: I1002 11:10:59.970232 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:10:59Z","lastTransitionTime":"2025-10-02T11:10:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.072731 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.072767 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.072780 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.072795 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.072808 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.175357 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.175558 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.175566 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.175580 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.175589 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.180065 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.194164 4929 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.208413 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.226660 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.240646 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.257664 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.275715 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.277531 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.277563 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.277575 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.277592 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.277604 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.292207 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.307434 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.324163 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.336181 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.347414 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.360494 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.375866 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.379468 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.379516 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.379528 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.379546 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.379559 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.387074 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.399056 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.414309 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:00Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.482023 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.482068 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.482079 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.482097 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.482111 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.584743 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.584788 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.584797 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.584812 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.584823 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.687748 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.687794 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.687825 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.687846 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.687858 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.790467 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.790508 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.790519 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.790536 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.790546 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.893684 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.893727 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.893739 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.893753 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:00 crc kubenswrapper[4929]: I1002 11:11:00.893763 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:00Z","lastTransitionTime":"2025-10-02T11:11:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.003189 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.003277 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.003291 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.003317 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.003330 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.105547 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.105586 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.105599 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.105617 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.105630 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.156321 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:01 crc kubenswrapper[4929]: E1002 11:11:01.156586 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.156356 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:01 crc kubenswrapper[4929]: E1002 11:11:01.156790 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.156328 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:01 crc kubenswrapper[4929]: E1002 11:11:01.157025 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.156356 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:01 crc kubenswrapper[4929]: E1002 11:11:01.157464 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.208344 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.208398 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.208410 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.208428 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.208440 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.311234 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.311267 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.311275 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.311288 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.311297 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.413713 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.413754 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.413763 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.413779 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.413789 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.516124 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.516198 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.516218 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.516247 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.516267 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.619682 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.619751 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.619772 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.619795 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.619847 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.723364 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.723427 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.723439 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.723454 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.723464 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.826762 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.826828 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.826840 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.826860 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.826871 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.930731 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.930797 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.930816 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.930846 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:01 crc kubenswrapper[4929]: I1002 11:11:01.930866 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:01Z","lastTransitionTime":"2025-10-02T11:11:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.034071 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.034582 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.034682 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.034796 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.034896 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.138117 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.138491 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.138575 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.138654 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.138717 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.242587 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.243541 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.243751 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.243950 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.244267 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.347864 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.347917 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.347927 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.347950 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.348000 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.450607 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.450651 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.450662 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.450679 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.450690 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.554449 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.554522 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.554543 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.554568 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.554586 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.657500 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.657573 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.657591 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.657616 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.657637 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.760196 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.760251 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.760271 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.760292 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.760306 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.862526 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.862576 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.862588 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.862607 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.862620 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.964639 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.964670 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.964682 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.964697 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:02 crc kubenswrapper[4929]: I1002 11:11:02.964708 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:02Z","lastTransitionTime":"2025-10-02T11:11:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.094229 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.094351 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.094373 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.094395 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.094416 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.156175 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:03 crc kubenswrapper[4929]: E1002 11:11:03.156331 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.156409 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:03 crc kubenswrapper[4929]: E1002 11:11:03.156505 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.156549 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:03 crc kubenswrapper[4929]: E1002 11:11:03.156620 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.156610 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:03 crc kubenswrapper[4929]: E1002 11:11:03.156717 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.197460 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.197496 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.197506 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.197520 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.197529 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.300860 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.300915 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.300936 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.300983 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.301001 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.402945 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.403017 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.403032 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.403054 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.403072 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.505874 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.505918 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.505927 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.505941 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.506005 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.609280 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.609420 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.609436 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.609455 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.609468 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.712508 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.712558 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.712571 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.712590 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.712601 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.816016 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.816067 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.816075 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.816090 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.816102 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.919300 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.919368 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.919386 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.919416 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:03 crc kubenswrapper[4929]: I1002 11:11:03.919434 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:03Z","lastTransitionTime":"2025-10-02T11:11:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.021612 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.021662 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.021681 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.021710 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.021728 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.125694 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.125853 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.125879 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.125909 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.125930 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.228643 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.228694 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.228711 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.228734 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.228751 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.331429 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.331464 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.331471 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.331485 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.331493 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.434845 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.434894 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.434903 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.434919 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.434928 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.537600 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.537666 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.537688 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.537718 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.537739 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.640949 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.641007 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.641016 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.641030 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.641039 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.743735 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.743772 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.743781 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.743794 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.743803 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.846786 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.846825 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.846835 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.846850 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.846860 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.949550 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.949611 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.949630 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.949654 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:04 crc kubenswrapper[4929]: I1002 11:11:04.949670 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:04Z","lastTransitionTime":"2025-10-02T11:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.052573 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.052656 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.052676 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.052703 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.052722 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155526 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155582 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155612 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155637 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155643 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155677 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155574 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155700 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.155637 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:05 crc kubenswrapper[4929]: E1002 11:11:05.155761 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:05 crc kubenswrapper[4929]: E1002 11:11:05.155676 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:05 crc kubenswrapper[4929]: E1002 11:11:05.155877 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:05 crc kubenswrapper[4929]: E1002 11:11:05.155998 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.258807 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.258878 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.258895 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.259356 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.259410 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.362180 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.362275 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.362287 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.362300 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.362310 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.465134 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.465184 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.465196 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.465214 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.465227 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.567572 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.567740 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.567760 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.567779 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.567792 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.671230 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.671278 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.671289 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.671307 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.671320 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.773984 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.774016 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.774027 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.774041 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.774050 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.876411 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.876792 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.876932 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.877128 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.877274 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.980021 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.980066 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.980078 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.980096 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:05 crc kubenswrapper[4929]: I1002 11:11:05.980108 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:05Z","lastTransitionTime":"2025-10-02T11:11:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.082700 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.082996 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.083008 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.083025 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.083037 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.157388 4929 scope.go:117] "RemoveContainer" containerID="b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08" Oct 02 11:11:06 crc kubenswrapper[4929]: E1002 11:11:06.157723 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.185817 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.185849 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.185858 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.185872 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.185880 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.288320 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.288352 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.288360 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.288373 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.288384 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.391815 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.391883 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.391906 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.391935 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.391981 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.495179 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.495246 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.495262 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.495287 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.495304 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.598038 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.598066 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.598074 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.598087 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.598098 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.700598 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.700664 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.700680 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.700699 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.700715 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.802946 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.802986 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.802995 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.803011 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.803021 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.905840 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.905882 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.905892 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.905910 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:06 crc kubenswrapper[4929]: I1002 11:11:06.905920 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:06Z","lastTransitionTime":"2025-10-02T11:11:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.008640 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.008687 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.008698 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.008714 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.008724 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.109851 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.109895 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.109904 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.109919 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.109929 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.124545 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.133805 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.133836 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.133846 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.133861 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.133871 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.151199 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155720 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155790 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
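[Editor's note: the status-patch failures at 11:11:07.124545, .151199 and .169262 are the actionable error in this section. Every node status PATCH is rejected because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, so nothing kubelet reports can be persisted; the back-to-back retries are consistent with kubelet attempting the patch several times per status-update cycle. The probe below is a hypothetical way to confirm the expiry from the node itself, assuming only that the endpoint speaks TLS on 127.0.0.1:9743 as the log states.]

// certprobe.go - hypothetical check for the expired webhook serving
// certificate reported in the "Error updating node status" entries above.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// InsecureSkipVerify lets us fetch the certificate even though
	// verification fails; that failure is exactly what we are probing.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		log.Fatal("no peer certificate presented")
	}
	cert := certs[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore)
	fmt.Printf("notAfter:  %s\n", cert.NotAfter) // log reports 2025-08-24T17:21:41Z
	fmt.Printf("expired:   %v\n", time.Now().After(cert.NotAfter))
}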
event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155801 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155820 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155831 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155874 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155938 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155881 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.155792 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.156041 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.156105 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.156232 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.156400 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.169262 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.173171 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.173232 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.173249 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.173272 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.173290 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.188240 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.191723 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.191767 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.191780 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.191798 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.191810 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.206441 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:07Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:07 crc kubenswrapper[4929]: E1002 11:11:07.206541 4929 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.207698 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.207741 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.207753 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.207763 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.207771 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.315244 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.315305 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.315315 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.315334 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.315344 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.417744 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.417784 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.417792 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.417808 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.417820 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.519949 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.520012 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.520024 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.520039 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.520050 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.622391 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.622447 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.622462 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.622479 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.622491 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.725169 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.725210 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.725221 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.725240 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.725253 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.827103 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.827142 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.827151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.827165 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.827175 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.929465 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.929504 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.929514 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.929531 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:07 crc kubenswrapper[4929]: I1002 11:11:07.929541 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:07Z","lastTransitionTime":"2025-10-02T11:11:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.031616 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.031669 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.031681 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.031701 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.031714 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.134871 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.134937 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.134954 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.135006 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.135025 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.237129 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.237169 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.237181 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.237195 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.237206 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.339682 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.339760 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.339780 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.339804 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.339821 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.442423 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.442464 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.442473 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.442487 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.442496 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.544718 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.544817 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.544829 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.544847 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.544857 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.646613 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.646653 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.646663 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.646677 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.646687 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.749801 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.749841 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.749849 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.749863 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.749872 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.853099 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.853152 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.853161 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.853177 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.853187 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.956342 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.956404 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.956422 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.956448 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:08 crc kubenswrapper[4929]: I1002 11:11:08.956467 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:08Z","lastTransitionTime":"2025-10-02T11:11:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.058915 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.058995 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.059015 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.059038 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.059056 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.155657 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.155695 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.155681 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.155667 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:09 crc kubenswrapper[4929]: E1002 11:11:09.155822 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:09 crc kubenswrapper[4929]: E1002 11:11:09.155923 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:09 crc kubenswrapper[4929]: E1002 11:11:09.156047 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:09 crc kubenswrapper[4929]: E1002 11:11:09.156181 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.162860 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.162922 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.162945 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.163002 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.163026 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.265420 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.265461 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.265470 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.265487 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.265499 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.368738 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.368809 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.368826 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.368852 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.368904 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.472357 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.472421 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.472438 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.472461 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.472483 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.575116 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.575183 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.575206 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.575230 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.575249 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.678281 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.678313 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.678323 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.678339 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.678350 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.780844 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.780891 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.780949 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.781012 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.781045 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.884065 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.884114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.884131 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.884154 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.884171 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.987097 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.987140 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.987151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.987167 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:09 crc kubenswrapper[4929]: I1002 11:11:09.987178 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:09Z","lastTransitionTime":"2025-10-02T11:11:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.089670 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.089738 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.089763 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.089791 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.089808 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.167691 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.177362 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.193924 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.193917 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.194047 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.194068 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.194095 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.194118 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.211213 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3
371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.222601 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.233210 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.243521 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.254182 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.263981 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.273279 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.294153 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.297026 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.297078 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.297095 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.297126 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.297143 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.313376 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.328140 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.346045 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.365253 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.378476 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/r
ootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.390047 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.398889 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.398915 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.398924 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.398937 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.398946 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.403660 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:10Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.500786 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.500825 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.500837 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.500855 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.500867 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.602518 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.602567 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.602583 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.602603 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.602619 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.704419 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.704462 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.704470 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.704487 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.704497 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.807224 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.807267 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.807276 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.807290 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.807299 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.909994 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.910042 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.910058 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.910078 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:10 crc kubenswrapper[4929]: I1002 11:11:10.910092 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:10Z","lastTransitionTime":"2025-10-02T11:11:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.011893 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.011931 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.011939 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.011953 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.011978 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.035584 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:11 crc kubenswrapper[4929]: E1002 11:11:11.035715 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:11:11 crc kubenswrapper[4929]: E1002 11:11:11.035785 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:43.035763824 +0000 UTC m=+103.586130188 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.114364 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.114421 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.114434 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.114453 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.114466 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.155976 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.156057 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.155976 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:11 crc kubenswrapper[4929]: E1002 11:11:11.156111 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:11 crc kubenswrapper[4929]: E1002 11:11:11.156183 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.156056 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:11 crc kubenswrapper[4929]: E1002 11:11:11.156277 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:11 crc kubenswrapper[4929]: E1002 11:11:11.156367 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.216778 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.216817 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.216827 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.216841 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.216851 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.318811 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.318867 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.318885 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.318912 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.318934 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.421544 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.421601 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.421611 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.421626 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.421636 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.524157 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.524198 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.524209 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.524225 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.524235 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.627024 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.627066 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.627078 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.627094 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.627106 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.729678 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.729738 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.729754 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.729776 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.729788 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.832076 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.832142 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.832158 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.832178 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.832194 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.935599 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.935656 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.935668 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.935687 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:11 crc kubenswrapper[4929]: I1002 11:11:11.935702 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:11Z","lastTransitionTime":"2025-10-02T11:11:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.038505 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.038544 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.038553 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.038571 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.038580 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.141053 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.141094 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.141108 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.141128 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.141140 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.242812 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.242844 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.242853 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.242865 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.242878 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.344550 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.344595 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.344607 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.344623 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.344635 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.447621 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.447690 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.447711 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.447733 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.447749 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.550133 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.550173 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.550182 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.550196 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.550206 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.652661 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.652699 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.652709 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.652723 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.652732 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.754697 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.754742 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.754757 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.754777 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.754794 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.858524 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.858566 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.858576 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.858592 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.858604 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.961052 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.961127 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.961151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.961182 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:12 crc kubenswrapper[4929]: I1002 11:11:12.961205 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:12Z","lastTransitionTime":"2025-10-02T11:11:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.064807 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.064850 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.064864 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.064886 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.064898 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.156202 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.156288 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt"
Oct 02 11:11:13 crc kubenswrapper[4929]: E1002 11:11:13.156350 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.156398 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.156405 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 11:11:13 crc kubenswrapper[4929]: E1002 11:11:13.156540 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49"
Oct 02 11:11:13 crc kubenswrapper[4929]: E1002 11:11:13.156615 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 11:11:13 crc kubenswrapper[4929]: E1002 11:11:13.156685 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.170107 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.170171 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.170186 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.170206 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.170225 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
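[Editor's note] Every `setters.go:603` entry in this stretch embeds the node's Ready condition as a JSON object in the `condition={...}` field. A small sketch that decodes one of those payloads, assuming a hand-rolled struct that mirrors only the fields visible in the log (not the canonical k8s.io/api NodeCondition type):

```go
// Decode the condition={...} payload that setters.go prints and pull out
// the transition time and reason for the NotReady flap.
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// nodeCondition mirrors the JSON fields seen in the log entries.
type nodeCondition struct {
	Type               string    `json:"type"`
	Status             string    `json:"status"`
	LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
	LastTransitionTime time.Time `json:"lastTransitionTime"`
	Reason             string    `json:"reason"`
	Message            string    `json:"message"`
}

func main() {
	// Payload copied verbatim from one of the setters.go:603 entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s since %s (reason %s)\n",
		c.Type, c.Status, c.LastTransitionTime.Format(time.RFC3339), c.Reason)
}
```

Filtering the journal through something like this makes the repeated ~100 ms NotReady heartbeats easy to collapse into a single transition-time summary.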
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.274129 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.274244 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.274265 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.274290 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.274308 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.377014 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.377091 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.377113 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.377510 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.377752 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.480441 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.480482 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.480491 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.480505 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.480514 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.582532 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.582577 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.582587 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.582602 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.582612 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.601714 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/0.log"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.601774 4929 generic.go:334] "Generic (PLEG): container finished" podID="4599e863-12c0-4c39-a873-a46012459555" containerID="91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64" exitCode=1
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.601808 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerDied","Data":"91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.602253 4929 scope.go:117] "RemoveContainer" containerID="91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.616154 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z"
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.641852 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.661403 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.677904 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.684349 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.684384 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.684397 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.684416 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.684428 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.695484 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.709048 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.725032 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.739160 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.750623 4929 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.763370 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.777116 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98554be9-1973-4c69-ac48-3d4552d5d59c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c14ade5b531de7d45ae71259cb0d04b23ad2785a37257732a83007134ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.786807 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.786877 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.786892 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.786911 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.786923 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.796622 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.812973 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"2025-10-02T11:10:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c\\\\n2025-10-02T11:10:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c to /host/opt/cni/bin/\\\\n2025-10-02T11:10:28Z [verbose] multus-daemon started\\\\n2025-10-02T11:10:28Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:11:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.825658 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.842289 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.857321 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.883130 
4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154
edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:13Z is after 2025-08-24T17:21:41Z"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.889401 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.889460 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.889479 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.889504 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.889526 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.991671 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.991708 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.991718 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.991733 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:13 crc kubenswrapper[4929]: I1002 11:11:13.991743 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:13Z","lastTransitionTime":"2025-10-02T11:11:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.094201 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.094240 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.094252 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.094270 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.094289 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.196577 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.196615 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.196626 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.196641 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.196653 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.298512 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.298579 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.298596 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.298621 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.298639 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.401345 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.401447 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.401469 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.401499 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.401524 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.503253 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.503297 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.503308 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.503325 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.503338 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.611353 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.611435 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.611456 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.611483 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.611513 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.615860 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/0.log"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.615914 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerStarted","Data":"d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50"}
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.631786 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"2025-10-02T11:10:27+00:00 [cnibincopy] 
Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c\\\\n2025-10-02T11:10:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c to /host/opt/cni/bin/\\\\n2025-10-02T11:10:28Z [verbose] multus-daemon started\\\\n2025-10-02T11:10:28Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:11:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.643054 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.657928 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.684272 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.701645 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.714738 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.714780 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.714792 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.714811 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.714824 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.715611 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.734597 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.746263 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.758487 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.771945 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.786400 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.805240 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.817296 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.817337 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.817346 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.817362 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.817373 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.818939 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.828815 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98554be9-1973-4c69-ac48-3d4552d5d59c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c14ade5b531de7d45ae71259cb0d04b23ad2785a37257732a83007134ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.844839 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.863027 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.877421 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.887367 4929 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:14Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.919503 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.919546 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 
11:11:14.919558 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.919574 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:14 crc kubenswrapper[4929]: I1002 11:11:14.919587 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:14Z","lastTransitionTime":"2025-10-02T11:11:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.021731 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.021784 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.021804 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.021822 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.021834 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.123706 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.123760 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.123774 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.123793 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.123809 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.155878 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.156024 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.156070 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:15 crc kubenswrapper[4929]: E1002 11:11:15.156042 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.156078 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:15 crc kubenswrapper[4929]: E1002 11:11:15.156181 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:15 crc kubenswrapper[4929]: E1002 11:11:15.156269 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:15 crc kubenswrapper[4929]: E1002 11:11:15.156384 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.226213 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.226279 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.226297 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.226323 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.226341 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.329659 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.329694 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.329705 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.329722 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.329746 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.432000 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.432062 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.432079 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.432101 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.432118 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.535091 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.535148 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.535165 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.535188 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.535206 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.637768 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.637829 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.637845 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.637870 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.637888 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.740561 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.740614 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.740630 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.740652 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.740670 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.843596 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.843647 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.843659 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.843675 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.843685 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.945773 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.945833 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.945856 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.945906 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:15 crc kubenswrapper[4929]: I1002 11:11:15.945934 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:15Z","lastTransitionTime":"2025-10-02T11:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.048601 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.048636 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.048645 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.048658 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.048666 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.151169 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.151240 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.151253 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.151274 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.151288 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.254952 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.255061 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.255082 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.255109 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.255128 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.357620 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.357677 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.357694 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.357716 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.357733 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.460330 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.460914 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.461148 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.461321 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.461460 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.565047 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.565119 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.565136 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.565162 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.565180 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.669071 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.669132 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.669151 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.669176 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.669196 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.772940 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.773063 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.773085 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.773119 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.773142 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.875591 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.875663 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.875682 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.875711 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.875730 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.979924 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.980028 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.980048 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.980073 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:16 crc kubenswrapper[4929]: I1002 11:11:16.980090 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:16Z","lastTransitionTime":"2025-10-02T11:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.083152 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.083207 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.083219 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.083240 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.083254 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.156219 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.156296 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.156223 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.156371 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.156553 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.156740 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.156872 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.157066 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.158178 4929 scope.go:117] "RemoveContainer" containerID="b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.186406 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.186464 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.186477 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.186496 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.186508 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.289746 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.289803 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.289815 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.289833 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.290095 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.393939 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.394029 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.394050 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.394078 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.394098 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.471031 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.471106 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.471126 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.471152 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.471172 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.492878 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.497058 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.497107 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.497116 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.497134 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.497144 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.516286 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.521182 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.521222 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.521234 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.521251 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.521264 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.536779 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.540507 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.540536 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.540545 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.540559 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.540568 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.558789 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.562403 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.562440 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.562451 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.562470 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.562482 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.577534 4929 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4f053f54-a5ea-4e60-9d09-e9d37bc5f0a1\\\",\\\"systemUUID\\\":\\\"0ee67423-5105-4391-ab46-c42062aff8c4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: E1002 11:11:17.577755 4929 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.582020 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.582069 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.582084 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.582114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.582130 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.629997 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/2.log" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.633230 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.633712 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.654288 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"2025-10-02T11:10:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c\\\\n2025-10-02T11:10:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c to /host/opt/cni/bin/\\\\n2025-10-02T11:10:28Z [verbose] multus-daemon started\\\\n2025-10-02T11:10:28Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:11:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.674597 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.684315 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.684378 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.684397 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.684418 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.684435 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.697730 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.725345 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initCo
ntainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.739158 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.751704 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.762571 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.772469 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.780574 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.787033 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.787064 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.787072 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.787085 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.787094 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.789596 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.799345 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.812110 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d
19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.823282 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.834363 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98554be9-1973-4c69-ac48-3d4552d5d59c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c14ade5b531de7d45ae71259cb0d04b23ad2785a37257732a83007134ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.846126 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.860150 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.871175 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/r
ootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.882203 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:17Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.888949 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.889001 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.889012 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.889027 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.889037 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.991667 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.991707 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.991716 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.991732 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:17 crc kubenswrapper[4929]: I1002 11:11:17.991741 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:17Z","lastTransitionTime":"2025-10-02T11:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.094166 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.094209 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.094221 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.094239 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.094252 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.196898 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.196929 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.196939 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.196952 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.196981 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.299393 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.299433 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.299441 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.299456 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.299466 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.402730 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.402763 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.402771 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.402785 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.402795 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.504948 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.505012 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.505029 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.505045 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.505056 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.608622 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.608690 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.608709 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.608735 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.608751 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.638501 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/3.log" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.639420 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/2.log" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.642462 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee" exitCode=1 Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.642524 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.642578 4929 scope.go:117] "RemoveContainer" containerID="b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.643806 4929 scope.go:117] "RemoveContainer" containerID="5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee" Oct 02 11:11:18 crc kubenswrapper[4929]: E1002 11:11:18.644189 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.660300 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.677070 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.690785 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.712718 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.712773 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.712792 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.712814 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.712831 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.713380 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef94762d3b46fb78f1a52f1e7b317762632a377
d99e67ca9cdf7774c63f7fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c7b7ea8db7bcd4da19a191c5d1ab73939388d3371164502ec90e94476faf08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:10:53Z\\\",\\\"message\\\":\\\"local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 11:10:53.047585 6625 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1002 11:10:53.047598 6625 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-kxz86\\\\nI1002 11:10:53.047589 6625 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 11:10:53.047467 6625 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1002 11:10:53.047616 6625 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-ident\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:18Z\\\",\\\"message\\\":\\\"11:11:18.074152 6972 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI1002 11:11:18.074169 6972 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.8832ms\\\\nI1002 11:11:18.074184 6972 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator for network=default\\\\nI1002 11:11:18.074184 6972 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1002 11:11:18.074202 6972 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.729066 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.747146 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.765648 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.778116 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.790192 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.798657 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.809309 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.815808 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.815859 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.815877 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.815901 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.815918 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.828444 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.845449 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98554be9-1973-4c69-ac48-3d4552d5d59c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c14ade5b531de7d45ae71259cb0d04b23ad2785a37257732a83007134ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.865368 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.883243 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.896008 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.911108 4929 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.917723 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.917764 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 
11:11:18.917782 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.917804 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.917856 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:18Z","lastTransitionTime":"2025-10-02T11:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:18 crc kubenswrapper[4929]: I1002 11:11:18.925038 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"2025-10-02T11:10:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c\\\\n2025-10-02T11:10:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c to /host/opt/cni/bin/\\\\n2025-10-02T11:10:28Z [verbose] multus-daemon started\\\\n2025-10-02T11:10:28Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:11:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:18Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.019753 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.019789 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.019801 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.019816 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.019829 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.122187 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.122243 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.122254 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.122266 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.122276 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.155933 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.155993 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.156053 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:19 crc kubenswrapper[4929]: E1002 11:11:19.156050 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:19 crc kubenswrapper[4929]: E1002 11:11:19.156092 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.156135 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:19 crc kubenswrapper[4929]: E1002 11:11:19.156142 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:19 crc kubenswrapper[4929]: E1002 11:11:19.156282 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.224719 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.224782 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.224799 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.224824 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.224845 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.327441 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.327634 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.327655 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.327679 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.327696 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.429949 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.429997 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.430007 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.430023 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.430035 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.532581 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.532652 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.532718 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.532786 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.532810 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.635794 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.635846 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.635893 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.635915 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.635931 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.648405 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/3.log" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.654277 4929 scope.go:117] "RemoveContainer" containerID="5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee" Oct 02 11:11:19 crc kubenswrapper[4929]: E1002 11:11:19.654545 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.673044 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.689376 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.708531 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.726560 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.738619 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.738686 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.738707 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.738733 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.738752 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.748858 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.769100 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.789596 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.808080 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.829589 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.842483 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.842559 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc 
kubenswrapper[4929]: I1002 11:11:19.842586 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.842630 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.842658 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.849127 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.861648 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.881819 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.893151 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98554be9-1973-4c69-ac48-3d4552d5d59c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c14ade5b531de7d45ae71259cb0d04b23ad2785a37257732a83007134ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.911854 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"2025-10-02T11:10:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c\\\\n2025-10-02T11:10:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c to /host/opt/cni/bin/\\\\n2025-10-02T11:10:28Z [verbose] multus-daemon started\\\\n2025-10-02T11:10:28Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:11:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.939851 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:18Z\\\",\\\"message\\\":\\\"11:11:18.074152 6972 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI1002 11:11:18.074169 6972 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.8832ms\\\\nI1002 11:11:18.074184 6972 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator for network=default\\\\nI1002 11:11:18.074184 6972 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1002 11:11:18.074202 6972 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:11:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.947050 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.947099 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.947118 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.947142 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.947366 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:19Z","lastTransitionTime":"2025-10-02T11:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.960187 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.977444 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:19 crc kubenswrapper[4929]: I1002 11:11:19.995535 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:19Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.052063 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.052171 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.052197 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.052231 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.052269 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.155283 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.155315 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.155325 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.155338 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.155347 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.178981 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gbz4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4599e863-12c0-4c39-a873-a46012459555\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:11:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:13Z\\\",\\\"message\\\":\\\"2025-10-02T11:10:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c\\\\n2025-10-02T11:10:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_904a5ede-066e-45c9-ad47-4681c50f7c0c to /host/opt/cni/bin/\\\\n2025-10-02T11:10:28Z [verbose] multus-daemon started\\\\n2025-10-02T11:10:28Z [verbose] Readiness Indicator file check\\\\n2025-10-02T11:11:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:11:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6pxn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gbz4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.207826 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T11:11:18Z\\\",\\\"message\\\":\\\"11:11:18.074152 6972 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI1002 11:11:18.074169 6972 services_controller.go:360] Finished syncing service certified-operators on namespace openshift-marketplace for network=default : 1.8832ms\\\\nI1002 11:11:18.074184 6972 services_controller.go:356] Processing sync for service openshift-machine-api/machine-api-operator for network=default\\\\nI1002 11:11:18.074184 6972 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1002 11:11:18.074202 6972 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verif\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:11:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pkltr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5fzl7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.224632 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.240148 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d4cd1afe07724ae42f39af12c045858861b9c92451112c3d3092bb1e4682c11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.255893 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fee86787a6635a95c3f6dc8e3f09d2dfac9e94bac71a9736edfc2d88f0ec012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52f0c8c70e7a1ddd9cedbea712a0701af87d25ad7c63cb5316b8f7d73cbfed8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.258318 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.258407 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.258426 4929 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.258486 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.258506 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.275649 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.299078 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-q4fb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce61e3b0-e445-41c1-be86-ac3e51cffbe1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://36b89cd871e160b9a1969c3e0ff2925cce3aed1ca2b256debd2a93ba6ff1ae7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwkb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-q4fb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.316773 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7hr2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6013d401-6138-4c35-9a72-00a269b5c765\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27ac2eefba73650ed557d3912481bec87c35bc80093ebae0709849106b1d9b45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr75p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7hr2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.335180 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b304fba-3157-4fb6-a634-ed39fd56821b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beb702641eb3fb4953dfdb1089715ec27a32b2fe05c81b3bec44acb2fa1ffba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29b1fafad51868e7b4401176441e72a0dcf1aae9526e2946e9d7cc2b1eadaabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8sh9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lh6dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 
11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.354272 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76212086-7806-40e0-ae43-1f865d46c5aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb128aa3b8011dffa200d5ed83903cb79a9ddd1ed93f5ee3c3672ea5c506674d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e3351cae0f906c7aac8d3368ccee514e3a67184ecad7601e4008915a9fc547\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de755a06e4d19bf63d23fab9488fc2a196712c4516b6a6c2ce30bff29609e154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a3ffc927e7a4a962e8f4c1ca486d000c5b4182837368faf1eb6c2b41dca5ff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58f36f013f446a1b6e3cc9d28d20c17338a7d3eb1183f0e5fd352e37206c519\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T11:10:18Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1002 11:10:03.633138 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 11:10:03.634814 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2982099834/tls.crt::/tmp/serving-cert-2982099834/tls.key\\\\\\\"\\\\nI1002 11:10:18.951423 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 11:10:18.954975 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 11:10:18.955003 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 11:10:18.955307 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 11:10:18.955325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 11:10:18.961824 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 11:10:18.961852 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961858 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 11:10:18.961863 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 11:10:18.961867 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 11:10:18.961871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 11:10:18.961875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 11:10:18.962119 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 11:10:18.964312 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d86589e28e7d1bf709bd59138e044d3d0eac0f7f44eb70be9e0c12aa0319881\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fce9272652e71b75e6ded86ece52322562415c4b2a9c2e1e9574ec4fb26ac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.361133 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.361196 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.361211 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.361230 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.361241 4929 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.372680 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf108b35-fad1-4b86-8ade-c15b74be0fc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422a4a7908e03732315e88abcb06f74dcc33995911770b54311a7faaabcbbe07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb4afb1a51a6d0d88d890ce0b1f0db5cab3403226dcd82cc603ed1b97485e9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e463c612a69594762adc6767a7c7566d17de9014fa0d3f4df3fd98c27f3c9e7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe9f1bebac5c3ca51dd624cdbbc652058df770cca4a4c025ca6e79a1efca0f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.393432 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db1642b8d5d5a748010192cbb23a9d40d60c982072bfd0bb4f7c714e5bcf454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.412820 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.434537 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kxz86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"840bd011-2ac2-422e-adc5-5de6c717fd54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7de8ecc1eaf284fc103eb43e081d63fa4760b73fb067d48095120a908795e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98ae13fe6e1e486c83543422d2e6a64100cf0cce7f62eab23ed1085340f85821\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59fd6faa6be1713a12c8596be2116b287258891b746945ae0fa6937d800b906b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18b5cd4181aa50af3e0a9de3556dc3b6c5ce4ef1e7bd1527d1ff5332dbca3e7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae0443ea7ef8fb634b8ac6bacf8830c6857d44a8f85cee3a235e735582176f6d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ded28b54582274b7d3f8ecedf965b0649ba25eade00d36cbf759c8c5fd88812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2289d7b52e621eb1580e3b933eb4c863f55bc6625c3ed5827c44a0d87d52aac2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xrfbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kxz86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.451080 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b4b5329-0385-4f39-9d63-70284421e448\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1eac5c3882a9201214c1597cac03ebdd9e2f17ba697e06743c28e77db35dbf0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:26Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hpmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8j488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.465220 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.465268 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.465280 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.465298 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.465313 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.465590 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-59lbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tggxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-59lbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.481084 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f677e6f9-47cd-4dc4-a6ad-83012af1dbb9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c7c9e01b64852dc7a9089eeaa3732913b1b4dc501875caabd16f859dede2bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60d4c235ce1dcb55f5f065412fb6e36b1afbe8ba823dd055ff870c6e8db5fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc00f2e5e9234830c0cc5478154f99cefe8097d2f33a2947c56f3c8aad00fcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa7509e27834acf41bac00ff8c430d78a5b245f71d744ca48eedc64fc14b5a4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.492378 4929 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98554be9-1973-4c69-ac48-3d4552d5d59c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T11:10:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8c14ade5b531de7d45ae71259cb0d04b23ad2785a37257732a83007134ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T11:10:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f51439e9ca14ffe8a2a51e8a32adec6cb798db8f45ce42e0f72d1f48e84649f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T11:10:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T11:10:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T11:10:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T11:11:20Z is after 2025-08-24T17:21:41Z" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.568232 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.568586 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.568628 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.568653 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.568671 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.673109 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.673186 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.673213 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.673245 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.673267 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.776057 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.776091 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.776099 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.776111 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.776120 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.878698 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.878781 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.878800 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.878826 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.878846 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.980802 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.980849 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.980859 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.980875 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:20 crc kubenswrapper[4929]: I1002 11:11:20.980884 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:20Z","lastTransitionTime":"2025-10-02T11:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.083297 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.083338 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.083349 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.083367 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.083379 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.155769 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.155841 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.155851 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:21 crc kubenswrapper[4929]: E1002 11:11:21.155894 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.155987 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:21 crc kubenswrapper[4929]: E1002 11:11:21.155983 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:21 crc kubenswrapper[4929]: E1002 11:11:21.156067 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:21 crc kubenswrapper[4929]: E1002 11:11:21.156308 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.186849 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.186904 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.186922 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.186945 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.186982 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.290718 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.290781 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.290819 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.290859 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.290883 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.394942 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.395146 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.395179 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.395210 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.395228 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.498265 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.498307 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.498319 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.498336 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.498349 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.601219 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.601275 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.601286 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.601304 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.601318 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.703886 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.703924 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.703933 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.703948 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.703991 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.807035 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.807097 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.807105 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.807140 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.807150 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.910553 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.910611 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.910628 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.910651 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:21 crc kubenswrapper[4929]: I1002 11:11:21.910669 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:21Z","lastTransitionTime":"2025-10-02T11:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.013790 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.013846 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.013863 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.013887 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.013904 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.117155 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.117257 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.117277 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.117299 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.117317 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.219714 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.219755 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.219766 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.219778 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.219786 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.322887 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.322989 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.323015 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.323045 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.323067 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.426534 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.426588 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.426604 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.426627 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.426644 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.529720 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.529790 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.529813 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.529839 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.529858 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.632268 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.632339 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.632362 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.632390 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.632410 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.734868 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.734942 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.735005 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.735042 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.735063 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.838034 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.838096 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.838114 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.838182 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.838204 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.941564 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.941623 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.941639 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.941663 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.941681 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:22Z","lastTransitionTime":"2025-10-02T11:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.959077 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:11:22 crc kubenswrapper[4929]: E1002 11:11:22.959255 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:26.95923062 +0000 UTC m=+147.509597014 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.959354 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:22 crc kubenswrapper[4929]: I1002 11:11:22.959394 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:22 crc kubenswrapper[4929]: E1002 11:11:22.959515 4929 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:11:22 crc kubenswrapper[4929]: E1002 11:11:22.959579 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:26.959565291 +0000 UTC m=+147.509931695 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 11:11:22 crc kubenswrapper[4929]: E1002 11:11:22.959600 4929 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:11:22 crc kubenswrapper[4929]: E1002 11:11:22.959698 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:26.959673245 +0000 UTC m=+147.510039639 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.044859 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.045028 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.045048 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.045072 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.045088 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.061000 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.061106 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.061229 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.061263 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.061282 4929 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.061355 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-02 11:12:27.061330916 +0000 UTC m=+147.611697320 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.061914 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.062018 4929 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.062042 4929 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.062143 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:27.06211093 +0000 UTC m=+147.612477324 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.148047 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.148106 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.148132 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.148159 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.148181 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.156588 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.156747 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.156996 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.157034 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.157002 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.157127 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.157380 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:23 crc kubenswrapper[4929]: E1002 11:11:23.157489 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.251918 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.251993 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.252012 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.252036 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.252053 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.354949 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.355057 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.355082 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.355109 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.355125 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.459198 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.459578 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.459786 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.460011 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.460228 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.563998 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.564108 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.564138 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.564195 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.564216 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.666457 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.666530 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.666557 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.666593 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.666617 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.770033 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.770112 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.770137 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.770164 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.770181 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.873542 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.873603 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.873621 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.873645 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.873662 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.976109 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.976178 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.976201 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.976229 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:23 crc kubenswrapper[4929]: I1002 11:11:23.976253 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:23Z","lastTransitionTime":"2025-10-02T11:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.080046 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.080119 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.080140 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.080168 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.080188 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.182643 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.182676 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.182684 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.182698 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.182707 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.284672 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.284717 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.284728 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.284744 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.284756 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.387207 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.387236 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.387245 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.387258 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.387267 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.489735 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.489782 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.489794 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.489810 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.489824 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.592728 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.592792 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.592810 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.592839 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.592857 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.696152 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.696231 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.696247 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.696272 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.696291 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.798674 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.798719 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.798734 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.798750 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.798759 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.901573 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.901604 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.901613 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.901628 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:24 crc kubenswrapper[4929]: I1002 11:11:24.901639 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:24Z","lastTransitionTime":"2025-10-02T11:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.005145 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.005297 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.005319 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.005384 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.005406 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.109241 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.109321 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.109344 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.109374 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.109400 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.156218 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.156261 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.156276 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.156217 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:25 crc kubenswrapper[4929]: E1002 11:11:25.156363 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:25 crc kubenswrapper[4929]: E1002 11:11:25.156553 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:25 crc kubenswrapper[4929]: E1002 11:11:25.156608 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:25 crc kubenswrapper[4929]: E1002 11:11:25.156704 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.212605 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.212661 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.212674 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.212693 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.212707 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.315471 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.315520 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.315536 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.315556 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.315571 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.418840 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.418887 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.418896 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.418910 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.418920 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.522011 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.522216 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.522246 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.522274 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.522311 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.625676 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.625721 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.625737 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.625758 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.625774 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.728815 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.728857 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.728871 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.728892 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.728906 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.831639 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.831697 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.831710 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.831769 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.831784 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.934948 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.935040 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.935058 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.935086 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:25 crc kubenswrapper[4929]: I1002 11:11:25.935104 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:25Z","lastTransitionTime":"2025-10-02T11:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.037552 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.037582 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.037590 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.037617 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.037626 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.139592 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.139634 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.139643 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.139660 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.139669 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.242211 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.242267 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.242283 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.242305 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.242334 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.345788 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.345861 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.345878 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.345902 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.345919 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.448650 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.448719 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.448732 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.448754 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.448766 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.552226 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.552302 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.552325 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.552356 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.552378 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.655899 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.656016 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.656043 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.656081 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.656119 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.758879 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.758942 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.758984 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.759016 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.759033 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.861922 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.862018 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.862044 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.862068 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.862087 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.964422 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.964479 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.964568 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.964615 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:26 crc kubenswrapper[4929]: I1002 11:11:26.964641 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:26Z","lastTransitionTime":"2025-10-02T11:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.067292 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.067399 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.067425 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.067457 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.067482 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.156094 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.156159 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.156196 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.156252 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:27 crc kubenswrapper[4929]: E1002 11:11:27.156567 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:27 crc kubenswrapper[4929]: E1002 11:11:27.156695 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:27 crc kubenswrapper[4929]: E1002 11:11:27.156797 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:27 crc kubenswrapper[4929]: E1002 11:11:27.156898 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.171135 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.171188 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.171205 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.171226 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.171245 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.273928 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.274015 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.274032 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.274060 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.274084 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.376553 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.376649 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.376702 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.376727 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.376743 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.480263 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.480339 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.480357 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.480383 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.480401 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.583492 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.583584 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.583604 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.583638 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.583659 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.685616 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.685706 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.685719 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.685738 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.685754 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.790443 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.790510 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.790521 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.790538 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.790553 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.894326 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.894379 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.894390 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.894407 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.894418 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.900558 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.900607 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.900630 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.900661 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.900685 4929 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T11:11:27Z","lastTransitionTime":"2025-10-02T11:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.978796 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt"] Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.979647 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.983367 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.984126 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.984752 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 11:11:27 crc kubenswrapper[4929]: I1002 11:11:27.986596 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.014783 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gbz4b" podStartSLOduration=64.014744502 podStartE2EDuration="1m4.014744502s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.014111481 +0000 UTC m=+88.564477855" watchObservedRunningTime="2025-10-02 11:11:28.014744502 +0000 UTC m=+88.565110926" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.015234 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/72a56b76-0341-4662-99b2-acecded5c286-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.015358 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72a56b76-0341-4662-99b2-acecded5c286-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.015485 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72a56b76-0341-4662-99b2-acecded5c286-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.015593 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/72a56b76-0341-4662-99b2-acecded5c286-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.015696 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72a56b76-0341-4662-99b2-acecded5c286-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.116484 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72a56b76-0341-4662-99b2-acecded5c286-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.116544 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/72a56b76-0341-4662-99b2-acecded5c286-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.116588 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72a56b76-0341-4662-99b2-acecded5c286-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.116627 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/72a56b76-0341-4662-99b2-acecded5c286-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.116650 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/72a56b76-0341-4662-99b2-acecded5c286-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.116635 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/72a56b76-0341-4662-99b2-acecded5c286-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.116925 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/72a56b76-0341-4662-99b2-acecded5c286-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.118630 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72a56b76-0341-4662-99b2-acecded5c286-service-ca\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.127160 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72a56b76-0341-4662-99b2-acecded5c286-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.145608 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72a56b76-0341-4662-99b2-acecded5c286-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-6wsxt\" (UID: \"72a56b76-0341-4662-99b2-acecded5c286\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.189248 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.189228145 podStartE2EDuration="1m9.189228145s" podCreationTimestamp="2025-10-02 11:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.174304439 +0000 UTC m=+88.724670803" watchObservedRunningTime="2025-10-02 11:11:28.189228145 +0000 UTC m=+88.739594509" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.189381 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=69.189378209 podStartE2EDuration="1m9.189378209s" podCreationTimestamp="2025-10-02 11:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.188830122 +0000 UTC m=+88.739196496" watchObservedRunningTime="2025-10-02 11:11:28.189378209 +0000 UTC m=+88.739744573" Oct 02 11:11:28 crc 
Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.236901 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7hr2m" podStartSLOduration=64.236881294 podStartE2EDuration="1m4.236881294s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.236345827 +0000 UTC m=+88.786712191" watchObservedRunningTime="2025-10-02 11:11:28.236881294 +0000 UTC m=+88.787247658" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.247198 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lh6dc" podStartSLOduration=63.247180492 podStartE2EDuration="1m3.247180492s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.246609944 +0000 UTC m=+88.796976308" watchObservedRunningTime="2025-10-02 11:11:28.247180492 +0000 UTC m=+88.797546856" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.259387 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=33.259369561 podStartE2EDuration="33.259369561s" podCreationTimestamp="2025-10-02 11:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.259164354 +0000 UTC m=+88.809530728" watchObservedRunningTime="2025-10-02 11:11:28.259369561 +0000 UTC m=+88.809735925" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.270155 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.270139274 podStartE2EDuration="18.270139274s" podCreationTimestamp="2025-10-02 11:11:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.269272957 +0000 UTC m=+88.819639321" watchObservedRunningTime="2025-10-02 11:11:28.270139274 +0000 UTC m=+88.820505638" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.302865 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kxz86" podStartSLOduration=64.302843177 podStartE2EDuration="1m4.302843177s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.302355081 +0000 UTC m=+88.852721445" watchObservedRunningTime="2025-10-02 11:11:28.302843177 +0000 UTC m=+88.853209571" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.303365 4929 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.317388 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podStartSLOduration=64.31737218 podStartE2EDuration="1m4.31737218s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.317203385 +0000 UTC m=+88.867569759" watchObservedRunningTime="2025-10-02 11:11:28.31737218 +0000 UTC m=+88.867738544" Oct 02 11:11:28 crc kubenswrapper[4929]: W1002 11:11:28.322084 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a56b76_0341_4662_99b2_acecded5c286.slice/crio-c333529e434a1dd3f6b95b8460748712db9e3c06538ebe806cd0dc93cc3ad167 WatchSource:0}: Error finding container c333529e434a1dd3f6b95b8460748712db9e3c06538ebe806cd0dc93cc3ad167: Status 404 returned error can't find the container with id c333529e434a1dd3f6b95b8460748712db9e3c06538ebe806cd0dc93cc3ad167 Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.686191 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" event={"ID":"72a56b76-0341-4662-99b2-acecded5c286","Type":"ContainerStarted","Data":"9963d0c21457f60a6cf0ff9c3c03331e6eae264b7997292deba343c09395e247"} Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.686247 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" event={"ID":"72a56b76-0341-4662-99b2-acecded5c286","Type":"ContainerStarted","Data":"c333529e434a1dd3f6b95b8460748712db9e3c06538ebe806cd0dc93cc3ad167"} Oct 02 11:11:28 crc kubenswrapper[4929]: I1002 11:11:28.704999 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-6wsxt" podStartSLOduration=64.704928545 podStartE2EDuration="1m4.704928545s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:28.703284893 +0000 UTC m=+89.253651327" watchObservedRunningTime="2025-10-02 11:11:28.704928545 +0000 UTC m=+89.255294949" Oct 02 11:11:29 crc kubenswrapper[4929]: I1002 11:11:29.156620 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:29 crc kubenswrapper[4929]: I1002 11:11:29.156645 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:29 crc kubenswrapper[4929]: I1002 11:11:29.156709 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:29 crc kubenswrapper[4929]: I1002 11:11:29.157136 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:29 crc kubenswrapper[4929]: E1002 11:11:29.158541 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:29 crc kubenswrapper[4929]: E1002 11:11:29.158645 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:29 crc kubenswrapper[4929]: E1002 11:11:29.158697 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:29 crc kubenswrapper[4929]: E1002 11:11:29.158738 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:31 crc kubenswrapper[4929]: I1002 11:11:31.155857 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:31 crc kubenswrapper[4929]: I1002 11:11:31.155889 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:31 crc kubenswrapper[4929]: I1002 11:11:31.155933 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:31 crc kubenswrapper[4929]: I1002 11:11:31.155883 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:31 crc kubenswrapper[4929]: E1002 11:11:31.156059 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:31 crc kubenswrapper[4929]: E1002 11:11:31.156437 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:31 crc kubenswrapper[4929]: E1002 11:11:31.156531 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:31 crc kubenswrapper[4929]: E1002 11:11:31.156584 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:31 crc kubenswrapper[4929]: I1002 11:11:31.156724 4929 scope.go:117] "RemoveContainer" containerID="5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee" Oct 02 11:11:31 crc kubenswrapper[4929]: E1002 11:11:31.156939 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:11:33 crc kubenswrapper[4929]: I1002 11:11:33.155653 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:33 crc kubenswrapper[4929]: E1002 11:11:33.157152 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:33 crc kubenswrapper[4929]: I1002 11:11:33.155736 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:33 crc kubenswrapper[4929]: I1002 11:11:33.155765 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:33 crc kubenswrapper[4929]: I1002 11:11:33.155736 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:33 crc kubenswrapper[4929]: E1002 11:11:33.157725 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:33 crc kubenswrapper[4929]: E1002 11:11:33.157759 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:33 crc kubenswrapper[4929]: E1002 11:11:33.157555 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:34 crc kubenswrapper[4929]: I1002 11:11:34.181877 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 11:11:35 crc kubenswrapper[4929]: I1002 11:11:35.155863 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:35 crc kubenswrapper[4929]: I1002 11:11:35.155940 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:35 crc kubenswrapper[4929]: I1002 11:11:35.156081 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:35 crc kubenswrapper[4929]: E1002 11:11:35.156731 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:35 crc kubenswrapper[4929]: E1002 11:11:35.156423 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:35 crc kubenswrapper[4929]: I1002 11:11:35.156133 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:35 crc kubenswrapper[4929]: E1002 11:11:35.156944 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:35 crc kubenswrapper[4929]: E1002 11:11:35.157076 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:37 crc kubenswrapper[4929]: I1002 11:11:37.155706 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:37 crc kubenswrapper[4929]: I1002 11:11:37.155787 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:37 crc kubenswrapper[4929]: I1002 11:11:37.155864 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:37 crc kubenswrapper[4929]: E1002 11:11:37.156008 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:37 crc kubenswrapper[4929]: E1002 11:11:37.156106 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:37 crc kubenswrapper[4929]: I1002 11:11:37.156142 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:37 crc kubenswrapper[4929]: E1002 11:11:37.156242 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:37 crc kubenswrapper[4929]: E1002 11:11:37.156511 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:39 crc kubenswrapper[4929]: I1002 11:11:39.156630 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:39 crc kubenswrapper[4929]: I1002 11:11:39.156678 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:39 crc kubenswrapper[4929]: I1002 11:11:39.156675 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:39 crc kubenswrapper[4929]: I1002 11:11:39.156743 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:39 crc kubenswrapper[4929]: E1002 11:11:39.157014 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:39 crc kubenswrapper[4929]: E1002 11:11:39.157147 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:39 crc kubenswrapper[4929]: E1002 11:11:39.157273 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:39 crc kubenswrapper[4929]: E1002 11:11:39.157392 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:40 crc kubenswrapper[4929]: I1002 11:11:40.210216 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.210185239 podStartE2EDuration="6.210185239s" podCreationTimestamp="2025-10-02 11:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:40.206550393 +0000 UTC m=+100.756916757" watchObservedRunningTime="2025-10-02 11:11:40.210185239 +0000 UTC m=+100.760551643" Oct 02 11:11:41 crc kubenswrapper[4929]: I1002 11:11:41.156233 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:41 crc kubenswrapper[4929]: I1002 11:11:41.156285 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:41 crc kubenswrapper[4929]: I1002 11:11:41.156241 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:41 crc kubenswrapper[4929]: E1002 11:11:41.156523 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:41 crc kubenswrapper[4929]: I1002 11:11:41.156651 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:41 crc kubenswrapper[4929]: E1002 11:11:41.156893 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:41 crc kubenswrapper[4929]: E1002 11:11:41.157031 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:41 crc kubenswrapper[4929]: E1002 11:11:41.157150 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:43 crc kubenswrapper[4929]: I1002 11:11:43.100066 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:43 crc kubenswrapper[4929]: E1002 11:11:43.100282 4929 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:11:43 crc kubenswrapper[4929]: E1002 11:11:43.100373 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs podName:1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:47.100354073 +0000 UTC m=+167.650720437 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs") pod "network-metrics-daemon-59lbt" (UID: "1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 11:11:43 crc kubenswrapper[4929]: I1002 11:11:43.156065 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:43 crc kubenswrapper[4929]: I1002 11:11:43.156138 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:43 crc kubenswrapper[4929]: I1002 11:11:43.156154 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:43 crc kubenswrapper[4929]: E1002 11:11:43.156186 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:43 crc kubenswrapper[4929]: I1002 11:11:43.156232 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:43 crc kubenswrapper[4929]: E1002 11:11:43.156318 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:43 crc kubenswrapper[4929]: E1002 11:11:43.156362 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:43 crc kubenswrapper[4929]: E1002 11:11:43.156428 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:45 crc kubenswrapper[4929]: I1002 11:11:45.156419 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:45 crc kubenswrapper[4929]: I1002 11:11:45.156415 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:45 crc kubenswrapper[4929]: E1002 11:11:45.156717 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:45 crc kubenswrapper[4929]: I1002 11:11:45.156459 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:45 crc kubenswrapper[4929]: E1002 11:11:45.156808 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:45 crc kubenswrapper[4929]: I1002 11:11:45.156423 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:45 crc kubenswrapper[4929]: E1002 11:11:45.156895 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:45 crc kubenswrapper[4929]: E1002 11:11:45.157238 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:46 crc kubenswrapper[4929]: I1002 11:11:46.156931 4929 scope.go:117] "RemoveContainer" containerID="5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee" Oct 02 11:11:46 crc kubenswrapper[4929]: E1002 11:11:46.157132 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5fzl7_openshift-ovn-kubernetes(5862ad0e-b703-4706-a7b4-25e4fdf5f78e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" Oct 02 11:11:47 crc kubenswrapper[4929]: I1002 11:11:47.155640 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:47 crc kubenswrapper[4929]: I1002 11:11:47.155668 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:47 crc kubenswrapper[4929]: E1002 11:11:47.155777 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:47 crc kubenswrapper[4929]: I1002 11:11:47.155784 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:47 crc kubenswrapper[4929]: I1002 11:11:47.155920 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:47 crc kubenswrapper[4929]: E1002 11:11:47.156040 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:47 crc kubenswrapper[4929]: E1002 11:11:47.156227 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:47 crc kubenswrapper[4929]: E1002 11:11:47.156299 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:49 crc kubenswrapper[4929]: I1002 11:11:49.156050 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:49 crc kubenswrapper[4929]: I1002 11:11:49.157243 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:49 crc kubenswrapper[4929]: E1002 11:11:49.157495 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:49 crc kubenswrapper[4929]: I1002 11:11:49.157552 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:49 crc kubenswrapper[4929]: I1002 11:11:49.157562 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:49 crc kubenswrapper[4929]: E1002 11:11:49.157663 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:49 crc kubenswrapper[4929]: E1002 11:11:49.157817 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:49 crc kubenswrapper[4929]: E1002 11:11:49.157886 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:51 crc kubenswrapper[4929]: I1002 11:11:51.155684 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:51 crc kubenswrapper[4929]: E1002 11:11:51.155822 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:51 crc kubenswrapper[4929]: I1002 11:11:51.155694 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:51 crc kubenswrapper[4929]: I1002 11:11:51.155891 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:51 crc kubenswrapper[4929]: I1002 11:11:51.156059 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:51 crc kubenswrapper[4929]: E1002 11:11:51.156156 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:51 crc kubenswrapper[4929]: E1002 11:11:51.156338 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:51 crc kubenswrapper[4929]: E1002 11:11:51.156491 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:53 crc kubenswrapper[4929]: I1002 11:11:53.156566 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:53 crc kubenswrapper[4929]: E1002 11:11:53.156693 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:53 crc kubenswrapper[4929]: I1002 11:11:53.156892 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:53 crc kubenswrapper[4929]: E1002 11:11:53.156939 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:53 crc kubenswrapper[4929]: I1002 11:11:53.157123 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:53 crc kubenswrapper[4929]: I1002 11:11:53.157131 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:53 crc kubenswrapper[4929]: E1002 11:11:53.157180 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:53 crc kubenswrapper[4929]: E1002 11:11:53.157280 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:55 crc kubenswrapper[4929]: I1002 11:11:55.155889 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:55 crc kubenswrapper[4929]: E1002 11:11:55.156053 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:55 crc kubenswrapper[4929]: I1002 11:11:55.155895 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:55 crc kubenswrapper[4929]: I1002 11:11:55.155895 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:55 crc kubenswrapper[4929]: E1002 11:11:55.156131 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:55 crc kubenswrapper[4929]: I1002 11:11:55.155914 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:55 crc kubenswrapper[4929]: E1002 11:11:55.156205 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:55 crc kubenswrapper[4929]: E1002 11:11:55.156259 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:57 crc kubenswrapper[4929]: I1002 11:11:57.156156 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:57 crc kubenswrapper[4929]: I1002 11:11:57.156206 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:57 crc kubenswrapper[4929]: I1002 11:11:57.156206 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:57 crc kubenswrapper[4929]: I1002 11:11:57.156172 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:57 crc kubenswrapper[4929]: E1002 11:11:57.156360 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:57 crc kubenswrapper[4929]: E1002 11:11:57.156497 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:57 crc kubenswrapper[4929]: E1002 11:11:57.156623 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:57 crc kubenswrapper[4929]: E1002 11:11:57.156782 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.156615 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.156626 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:11:59 crc kubenswrapper[4929]: E1002 11:11:59.157065 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.156615 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:11:59 crc kubenswrapper[4929]: E1002 11:11:59.157143 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.156670 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:11:59 crc kubenswrapper[4929]: E1002 11:11:59.157216 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:11:59 crc kubenswrapper[4929]: E1002 11:11:59.157266 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.800049 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/1.log" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.800942 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/0.log" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.801049 4929 generic.go:334] "Generic (PLEG): container finished" podID="4599e863-12c0-4c39-a873-a46012459555" containerID="d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50" exitCode=1 Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.801108 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerDied","Data":"d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50"} Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.801203 4929 scope.go:117] "RemoveContainer" containerID="91c5c764a9a06a9d380f9e47e513862d31f9d291172f476ac372f436949b4b64" Oct 02 11:11:59 crc kubenswrapper[4929]: I1002 11:11:59.801796 4929 scope.go:117] "RemoveContainer" containerID="d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50" Oct 02 11:11:59 crc kubenswrapper[4929]: E1002 11:11:59.802066 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gbz4b_openshift-multus(4599e863-12c0-4c39-a873-a46012459555)\"" pod="openshift-multus/multus-gbz4b" podUID="4599e863-12c0-4c39-a873-a46012459555" Oct 02 11:12:00 crc kubenswrapper[4929]: E1002 11:12:00.148542 4929 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 02 11:12:00 crc kubenswrapper[4929]: I1002 11:12:00.160765 4929 scope.go:117] "RemoveContainer" containerID="5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee" Oct 02 11:12:00 crc kubenswrapper[4929]: E1002 11:12:00.287477 4929 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 11:12:00 crc kubenswrapper[4929]: I1002 11:12:00.807732 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/3.log" Oct 02 11:12:00 crc kubenswrapper[4929]: I1002 11:12:00.811327 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerStarted","Data":"2f40b19688c14633cea5dde4e9ee1cf384439a3fabd89a65b3a3b3da215f6d97"} Oct 02 11:12:00 crc kubenswrapper[4929]: I1002 11:12:00.812240 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:12:00 crc kubenswrapper[4929]: I1002 11:12:00.813322 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/1.log" Oct 02 11:12:01 crc kubenswrapper[4929]: I1002 11:12:01.061301 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podStartSLOduration=97.061278108 podStartE2EDuration="1m37.061278108s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:00.851886572 +0000 UTC m=+121.402252946" watchObservedRunningTime="2025-10-02 11:12:01.061278108 +0000 UTC m=+121.611644472" Oct 02 11:12:01 crc kubenswrapper[4929]: I1002 11:12:01.062368 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-59lbt"] Oct 02 11:12:01 crc kubenswrapper[4929]: I1002 11:12:01.062549 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:01 crc kubenswrapper[4929]: E1002 11:12:01.062711 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:01 crc kubenswrapper[4929]: I1002 11:12:01.166041 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:01 crc kubenswrapper[4929]: I1002 11:12:01.166260 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:01 crc kubenswrapper[4929]: I1002 11:12:01.166112 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:01 crc kubenswrapper[4929]: E1002 11:12:01.166387 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:01 crc kubenswrapper[4929]: E1002 11:12:01.166581 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:01 crc kubenswrapper[4929]: E1002 11:12:01.166841 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:03 crc kubenswrapper[4929]: I1002 11:12:03.155739 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:03 crc kubenswrapper[4929]: I1002 11:12:03.155808 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:03 crc kubenswrapper[4929]: I1002 11:12:03.155883 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:03 crc kubenswrapper[4929]: I1002 11:12:03.155739 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:03 crc kubenswrapper[4929]: E1002 11:12:03.156047 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:03 crc kubenswrapper[4929]: E1002 11:12:03.156168 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:03 crc kubenswrapper[4929]: E1002 11:12:03.156456 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:03 crc kubenswrapper[4929]: E1002 11:12:03.156577 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:05 crc kubenswrapper[4929]: I1002 11:12:05.156434 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:05 crc kubenswrapper[4929]: I1002 11:12:05.156546 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:05 crc kubenswrapper[4929]: E1002 11:12:05.156639 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:05 crc kubenswrapper[4929]: I1002 11:12:05.156654 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:05 crc kubenswrapper[4929]: I1002 11:12:05.156718 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:05 crc kubenswrapper[4929]: E1002 11:12:05.156820 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:05 crc kubenswrapper[4929]: E1002 11:12:05.156917 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:05 crc kubenswrapper[4929]: E1002 11:12:05.156996 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:05 crc kubenswrapper[4929]: E1002 11:12:05.289146 4929 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:12:07 crc kubenswrapper[4929]: I1002 11:12:07.156514 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:07 crc kubenswrapper[4929]: I1002 11:12:07.156543 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:07 crc kubenswrapper[4929]: E1002 11:12:07.156687 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:07 crc kubenswrapper[4929]: I1002 11:12:07.156570 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:07 crc kubenswrapper[4929]: E1002 11:12:07.156756 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:07 crc kubenswrapper[4929]: I1002 11:12:07.156550 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:07 crc kubenswrapper[4929]: E1002 11:12:07.156938 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:07 crc kubenswrapper[4929]: E1002 11:12:07.157005 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:09 crc kubenswrapper[4929]: I1002 11:12:09.156415 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:09 crc kubenswrapper[4929]: I1002 11:12:09.156416 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:09 crc kubenswrapper[4929]: I1002 11:12:09.156428 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:09 crc kubenswrapper[4929]: E1002 11:12:09.156640 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:09 crc kubenswrapper[4929]: I1002 11:12:09.156900 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:09 crc kubenswrapper[4929]: E1002 11:12:09.156908 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:09 crc kubenswrapper[4929]: E1002 11:12:09.157033 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:09 crc kubenswrapper[4929]: E1002 11:12:09.157193 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:10 crc kubenswrapper[4929]: E1002 11:12:10.290009 4929 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:12:11 crc kubenswrapper[4929]: I1002 11:12:11.156304 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:11 crc kubenswrapper[4929]: I1002 11:12:11.156384 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:11 crc kubenswrapper[4929]: I1002 11:12:11.156336 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:11 crc kubenswrapper[4929]: I1002 11:12:11.156321 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:11 crc kubenswrapper[4929]: E1002 11:12:11.156506 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:11 crc kubenswrapper[4929]: E1002 11:12:11.156617 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:11 crc kubenswrapper[4929]: E1002 11:12:11.156658 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:11 crc kubenswrapper[4929]: E1002 11:12:11.156711 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:13 crc kubenswrapper[4929]: I1002 11:12:13.155603 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:13 crc kubenswrapper[4929]: I1002 11:12:13.155600 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:13 crc kubenswrapper[4929]: I1002 11:12:13.155622 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:13 crc kubenswrapper[4929]: I1002 11:12:13.155711 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:13 crc kubenswrapper[4929]: E1002 11:12:13.155888 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:13 crc kubenswrapper[4929]: E1002 11:12:13.156012 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:13 crc kubenswrapper[4929]: E1002 11:12:13.156073 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:13 crc kubenswrapper[4929]: E1002 11:12:13.156285 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:15 crc kubenswrapper[4929]: I1002 11:12:15.156350 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:15 crc kubenswrapper[4929]: I1002 11:12:15.156363 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:15 crc kubenswrapper[4929]: I1002 11:12:15.156420 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:15 crc kubenswrapper[4929]: I1002 11:12:15.156490 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:15 crc kubenswrapper[4929]: E1002 11:12:15.156763 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:15 crc kubenswrapper[4929]: I1002 11:12:15.156910 4929 scope.go:117] "RemoveContainer" containerID="d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50" Oct 02 11:12:15 crc kubenswrapper[4929]: E1002 11:12:15.156909 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:15 crc kubenswrapper[4929]: E1002 11:12:15.156937 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:15 crc kubenswrapper[4929]: E1002 11:12:15.156998 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:15 crc kubenswrapper[4929]: E1002 11:12:15.291632 4929 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:12:15 crc kubenswrapper[4929]: I1002 11:12:15.866565 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/1.log" Oct 02 11:12:15 crc kubenswrapper[4929]: I1002 11:12:15.866688 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerStarted","Data":"0f80d407099e4b88f75718fb6019a8896ef25529261ce8f5fc038aacb4dd4c0e"} Oct 02 11:12:17 crc kubenswrapper[4929]: I1002 11:12:17.155765 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:17 crc kubenswrapper[4929]: I1002 11:12:17.155877 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:17 crc kubenswrapper[4929]: I1002 11:12:17.155796 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:17 crc kubenswrapper[4929]: I1002 11:12:17.156010 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:17 crc kubenswrapper[4929]: E1002 11:12:17.156097 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:17 crc kubenswrapper[4929]: E1002 11:12:17.156261 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:17 crc kubenswrapper[4929]: E1002 11:12:17.156363 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:17 crc kubenswrapper[4929]: E1002 11:12:17.156479 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:19 crc kubenswrapper[4929]: I1002 11:12:19.156493 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:19 crc kubenswrapper[4929]: I1002 11:12:19.156551 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:19 crc kubenswrapper[4929]: I1002 11:12:19.156549 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:19 crc kubenswrapper[4929]: I1002 11:12:19.156529 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:19 crc kubenswrapper[4929]: E1002 11:12:19.156734 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-59lbt" podUID="1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49" Oct 02 11:12:19 crc kubenswrapper[4929]: E1002 11:12:19.156855 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 11:12:19 crc kubenswrapper[4929]: E1002 11:12:19.157005 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 11:12:19 crc kubenswrapper[4929]: E1002 11:12:19.157151 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.155553 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.155589 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.155656 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.155746 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.158415 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.158825 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.159592 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.159670 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.159911 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 11:12:21 crc kubenswrapper[4929]: I1002 11:12:21.160156 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 11:12:26 crc kubenswrapper[4929]: I1002 11:12:26.993196 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:26 crc kubenswrapper[4929]: E1002 11:12:26.993441 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:14:28.993395502 +0000 UTC m=+269.543761906 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:26 crc kubenswrapper[4929]: I1002 11:12:26.993701 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:26 crc kubenswrapper[4929]: I1002 11:12:26.993748 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:26 crc kubenswrapper[4929]: I1002 11:12:26.995328 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.002904 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.095286 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.095437 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.101445 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.101541 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.176725 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.202374 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.209132 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 11:12:27 crc kubenswrapper[4929]: W1002 11:12:27.691501 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-628a3ba865d210ea6ff82d99b1938fec6b95e64200a1dfe8040f5211303f1ccb WatchSource:0}: Error finding container 628a3ba865d210ea6ff82d99b1938fec6b95e64200a1dfe8040f5211303f1ccb: Status 404 returned error can't find the container with id 628a3ba865d210ea6ff82d99b1938fec6b95e64200a1dfe8040f5211303f1ccb Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.914093 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c082d7ad4c4f4ef17af4833f6865ec43a91331189431b237c0a7638eb5ade4df"} Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.914140 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d687e2c08a4fbd51cef4ee06c67e1edd78ff0c392860c5a00a527bfb37b33c62"} Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.915700 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a87d0da5fdfdbf5053c7903fd670988c5ca457a34a63830d0a0a9dcd1ea2ab67"} Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.915786 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"628a3ba865d210ea6ff82d99b1938fec6b95e64200a1dfe8040f5211303f1ccb"} Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.917388 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ba43330fa9874a55539a3dadd1847d1f1621e0515380c45b3cb0bf2096d82e45"} Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.917449 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6df66c3e05c1634787f7e962f628a6d1da9ee6e8c8761a61c81720b5d2517868"} Oct 02 11:12:27 crc kubenswrapper[4929]: I1002 11:12:27.917770 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.148908 4929 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.190047 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wfcz"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.190769 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.190837 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.191392 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.191694 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hzmkl"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.192468 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.193195 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.193584 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.193908 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.194263 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.194356 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cf6n9"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.196687 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.202857 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6xvv6"] Oct 02 11:12:29 crc kubenswrapper[4929]: W1002 11:12:29.211915 4929 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 02 11:12:29 crc kubenswrapper[4929]: E1002 11:12:29.211995 4929 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 11:12:29 crc kubenswrapper[4929]: W1002 11:12:29.212119 4929 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Oct 02 11:12:29 crc kubenswrapper[4929]: E1002 11:12:29.212138 4929 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.212679 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.215119 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.215884 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.217347 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.218839 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-dir\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.218879 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-audit-policies\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.218905 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c619b40e-e812-4689-aa5d-3ad89ec57afc-serving-cert\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.218921 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.218935 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-encryption-config\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.218950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/923649eb-ddfc-4299-b94e-3f549a863233-serving-cert\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.218979 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-encryption-config\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219005 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219051 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-client-ca\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219092 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219111 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219133 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219165 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-config\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219180 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqj4\" (UniqueName: \"kubernetes.io/projected/1f604268-ce5c-4822-be72-1fe5615cf7bc-kube-api-access-mtqj4\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219195 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219213 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c24e4fe9-f50e-4989-9316-e17e05b64acc-node-pullsecrets\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219235 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-config\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219255 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-client-ca\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219272 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-image-import-ca\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219286 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh6ns\" (UniqueName: \"kubernetes.io/projected/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-kube-api-access-dh6ns\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219301 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f604268-ce5c-4822-be72-1fe5615cf7bc-audit-dir\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219321 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219343 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-audit\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219361 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-serving-cert\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219385 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219402 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-etcd-client\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219423 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219445 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219467 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkg6d\" (UniqueName: \"kubernetes.io/projected/c619b40e-e812-4689-aa5d-3ad89ec57afc-kube-api-access-pkg6d\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219489 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-etcd-serving-ca\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219511 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219526 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-serving-cert\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219543 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-config\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219557 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219571 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219586 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219602 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219617 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wslb8\" (UniqueName: \"kubernetes.io/projected/9a73326b-13cd-4b69-b17b-93cd1b59679c-kube-api-access-wslb8\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219634 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219653 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/c24e4fe9-f50e-4989-9316-e17e05b64acc-audit-dir\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219711 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4kt\" (UniqueName: \"kubernetes.io/projected/c24e4fe9-f50e-4989-9316-e17e05b64acc-kube-api-access-jx4kt\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219744 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-etcd-client\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219759 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-policies\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219776 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnc2\" (UniqueName: \"kubernetes.io/projected/923649eb-ddfc-4299-b94e-3f549a863233-kube-api-access-zhnc2\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.219807 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.223487 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.224446 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.224688 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.224745 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.224899 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225115 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 
11:12:29.225162 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225258 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225512 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225561 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225600 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225522 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225702 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225731 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225817 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.225910 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.227204 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gpcbf"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.227860 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.228038 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.229226 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.229355 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.229515 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.229702 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.229878 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.230994 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.231206 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.232563 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.232742 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.232839 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.232903 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233033 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233091 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233132 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233267 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233407 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233438 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233593 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 
11:12:29.233729 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233835 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233847 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jxdhx"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.233990 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234030 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234162 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234277 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234397 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234505 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234603 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234675 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.234699 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235041 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235197 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235233 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235286 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235438 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235432 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235672 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235796 4929 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.235804 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.237519 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qxkdv"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.237877 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qxkdv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.238283 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.239334 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.240059 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.243048 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.243345 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.248120 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zc6nf"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.248684 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.249397 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.249629 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.250369 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.250524 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.250662 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.250697 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.250815 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.250908 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.251243 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.251447 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.251608 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.251819 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.252256 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.253095 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.254275 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.254905 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.257503 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.258646 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.258804 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.258946 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.259101 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.259151 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.259768 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.259853 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.259883 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.260062 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.259778 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.260158 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.260206 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.260064 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.260334 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.260279 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.261000 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nqrd9"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.261614 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.262089 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.262870 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.263617 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wxj9k"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.264144 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.264903 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4hh6l"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.265772 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.267184 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.267697 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.271759 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.283701 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.285420 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.304304 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.304477 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.304690 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.304899 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.304920 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.307468 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.308503 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.312826 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.317096 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.317303 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qxjwx"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.317849 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.323451 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.328668 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.328838 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.330360 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.330610 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.330939 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.333484 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-client-ca\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334231 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.331534 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334285 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44mb\" (UniqueName: \"kubernetes.io/projected/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-kube-api-access-z44mb\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334310 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334330 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6fb363-0b33-4217-82df-61e525432168-config\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334347 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-trusted-ca-bundle\") pod \"console-f9d7485db-zc6nf\" (UID: 
\"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334363 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-metrics-certs\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334383 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-config\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334400 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqj4\" (UniqueName: \"kubernetes.io/projected/1f604268-ce5c-4822-be72-1fe5615cf7bc-kube-api-access-mtqj4\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334417 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334434 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpwtw\" (UniqueName: \"kubernetes.io/projected/ac426e70-652b-44e7-83ff-3a6c26942921-kube-api-access-tpwtw\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334453 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c24e4fe9-f50e-4989-9316-e17e05b64acc-node-pullsecrets\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.334259 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-client-ca\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.335427 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.335617 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c24e4fe9-f50e-4989-9316-e17e05b64acc-node-pullsecrets\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " 
pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.335625 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.336142 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r7lmd"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.336280 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-config\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.336341 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz64f\" (UniqueName: \"kubernetes.io/projected/aa6fb363-0b33-4217-82df-61e525432168-kube-api-access-kz64f\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.336367 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6kxq\" (UniqueName: \"kubernetes.io/projected/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-kube-api-access-s6kxq\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.336403 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-config\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.336442 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-client-ca\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.336659 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337329 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-config\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337364 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337444 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062edadc-90e2-4fe6-9073-d8a32ba6b345-config\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337472 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc152222-074c-4883-bb5f-f0a836a96023-config-volume\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337490 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337512 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-default-certificate\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337532 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-image-import-ca\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337548 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh6ns\" (UniqueName: \"kubernetes.io/projected/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-kube-api-access-dh6ns\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337559 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-client-ca\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337610 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f604268-ce5c-4822-be72-1fe5615cf7bc-audit-dir\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 
11:12:29.337629 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337654 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04489d51-a83f-4152-ac0c-d5a46898e21a-proxy-tls\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337672 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04489d51-a83f-4152-ac0c-d5a46898e21a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337691 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.337805 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f604268-ce5c-4822-be72-1fe5615cf7bc-audit-dir\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.338301 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-image-import-ca\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.338807 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-audit\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.338878 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-serving-cert\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.338905 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.344723 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.344938 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.346196 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.352791 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.354249 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.355362 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361082 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clztp\" (UniqueName: \"kubernetes.io/projected/d6016b7e-1424-416e-bcd9-a3490eec1493-kube-api-access-clztp\") pod \"dns-operator-744455d44c-nqrd9\" (UID: \"d6016b7e-1424-416e-bcd9-a3490eec1493\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361665 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-audit\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361758 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ac426e70-652b-44e7-83ff-3a6c26942921-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361807 4929 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-etcd-client\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361828 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361881 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361904 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqnqw\" (UniqueName: \"kubernetes.io/projected/313a7d22-f379-4a34-8797-77a77c0ddc97-kube-api-access-lqnqw\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361921 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4809e56-3d51-4508-954c-df41db145ee7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.361950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrxdj\" (UniqueName: \"kubernetes.io/projected/cc152222-074c-4883-bb5f-f0a836a96023-kube-api-access-hrxdj\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.362025 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkr9l\" (UniqueName: \"kubernetes.io/projected/285b3242-b0f5-47cc-8f8c-5c580bf7b31e-kube-api-access-mkr9l\") pod \"cluster-samples-operator-665b6dd947-5t8ss\" (UID: \"285b3242-b0f5-47cc-8f8c-5c580bf7b31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.362055 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkg6d\" (UniqueName: \"kubernetes.io/projected/c619b40e-e812-4689-aa5d-3ad89ec57afc-kube-api-access-pkg6d\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.362653 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.362833 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-etcd-serving-ca\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.362868 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa6fb363-0b33-4217-82df-61e525432168-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.362892 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4809e56-3d51-4508-954c-df41db145ee7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.363790 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-config\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.363839 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjhxs\" (UniqueName: \"kubernetes.io/projected/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-kube-api-access-sjhxs\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.363898 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.363925 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-serving-cert\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.363949 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/062edadc-90e2-4fe6-9073-d8a32ba6b345-serving-cert\") pod 
\"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364004 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-etcd-serving-ca\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364039 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364197 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-signing-cabundle\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364283 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-config\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364332 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364391 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qckgc\" (UniqueName: \"kubernetes.io/projected/062edadc-90e2-4fe6-9073-d8a32ba6b345-kube-api-access-qckgc\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364424 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-serving-cert\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364443 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" 
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364474 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364492 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.364515 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wslb8\" (UniqueName: \"kubernetes.io/projected/9a73326b-13cd-4b69-b17b-93cd1b59679c-kube-api-access-wslb8\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365162 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365426 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c24e4fe9-f50e-4989-9316-e17e05b64acc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365479 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365511 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fcp5\" (UniqueName: \"kubernetes.io/projected/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-kube-api-access-8fcp5\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365660 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-serving-cert\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365703 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-oauth-serving-cert\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365734 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aa6fb363-0b33-4217-82df-61e525432168-images\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365757 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365774 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h97s\" (UniqueName: \"kubernetes.io/projected/edb0db23-72e4-4288-b8e6-84df2e5091b3-kube-api-access-5h97s\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365799 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365820 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/062edadc-90e2-4fe6-9073-d8a32ba6b345-trusted-ca\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365836 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-oauth-config\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365857 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-service-ca-bundle\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365874 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-etcd-client\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365895 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c24e4fe9-f50e-4989-9316-e17e05b64acc-audit-dir\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365910 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4kt\" (UniqueName: \"kubernetes.io/projected/c24e4fe9-f50e-4989-9316-e17e05b64acc-kube-api-access-jx4kt\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365924 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-policies\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.365939 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-ca\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.366032 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-config\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.366085 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-config\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.366115 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/edb0db23-72e4-4288-b8e6-84df2e5091b3-srv-cert\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.366132 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-stats-auth\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx"
Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.366149 4929 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4809e56-3d51-4508-954c-df41db145ee7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.366128 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.366383 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-serving-cert\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.353315 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.371180 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.371660 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.372470 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.376452 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.376610 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.376859 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-etcd-client\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.376928 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.377219 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-etcd-client\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.377416 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.377982 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-config\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378076 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnc2\" (UniqueName: \"kubernetes.io/projected/923649eb-ddfc-4299-b94e-3f549a863233-kube-api-access-zhnc2\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378102 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c24e4fe9-f50e-4989-9316-e17e05b64acc-audit-dir\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378115 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378265 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378583 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-service-ca\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378675 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-serving-cert\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378655 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-dir\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378771 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-dir\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378819 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378829 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-signing-key\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378922 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-audit-policies\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.378951 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tv6q\" (UniqueName: \"kubernetes.io/projected/04489d51-a83f-4152-ac0c-d5a46898e21a-kube-api-access-7tv6q\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.379299 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c619b40e-e812-4689-aa5d-3ad89ec57afc-serving-cert\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.379620 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f604268-ce5c-4822-be72-1fe5615cf7bc-audit-policies\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.379871 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-policies\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.379927 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.380775 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.380817 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-encryption-config\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.380845 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313a7d22-f379-4a34-8797-77a77c0ddc97-serving-cert\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.380920 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381211 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvw4q\" (UniqueName: \"kubernetes.io/projected/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-kube-api-access-vvw4q\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381522 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381721 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbd6m\" (UniqueName: \"kubernetes.io/projected/854dc2ee-5769-484e-a9a4-69a592dcaac1-kube-api-access-cbd6m\") pod \"downloads-7954f5f757-qxkdv\" (UID: \"854dc2ee-5769-484e-a9a4-69a592dcaac1\") " pod="openshift-console/downloads-7954f5f757-qxkdv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381755 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/923649eb-ddfc-4299-b94e-3f549a863233-serving-cert\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381777 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc152222-074c-4883-bb5f-f0a836a96023-secret-volume\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381796 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac426e70-652b-44e7-83ff-3a6c26942921-serving-cert\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381815 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-service-ca\") pod 
\"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381830 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/285b3242-b0f5-47cc-8f8c-5c580bf7b31e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5t8ss\" (UID: \"285b3242-b0f5-47cc-8f8c-5c580bf7b31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381849 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6016b7e-1424-416e-bcd9-a3490eec1493-metrics-tls\") pod \"dns-operator-744455d44c-nqrd9\" (UID: \"d6016b7e-1424-416e-bcd9-a3490eec1493\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381873 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-encryption-config\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381894 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381915 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2g5\" (UniqueName: \"kubernetes.io/projected/b4809e56-3d51-4508-954c-df41db145ee7-kube-api-access-lv2g5\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381932 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/edb0db23-72e4-4288-b8e6-84df2e5091b3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.381950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-client\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.382383 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.384719 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.384776 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.385866 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c619b40e-e812-4689-aa5d-3ad89ec57afc-serving-cert\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.395679 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.397611 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c24e4fe9-f50e-4989-9316-e17e05b64acc-encryption-config\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.400406 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.403455 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1f604268-ce5c-4822-be72-1fe5615cf7bc-encryption-config\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.403541 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.404054 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4pzk"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.404181 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.404231 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/923649eb-ddfc-4299-b94e-3f549a863233-serving-cert\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.405424 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.415701 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.430327 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.431309 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.440517 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.447884 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wfcz"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.448562 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.449155 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.450663 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.451133 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.451234 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.452499 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.453284 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.453783 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.454029 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.454129 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.455363 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pxt8j"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.456243 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.456361 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.456636 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.457867 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.460481 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gpcbf"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.462433 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.463527 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hzmkl"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.464603 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cf6n9"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.468662 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jxdhx"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.468691 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kw7jx"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.469910 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.470023 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.472581 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.473770 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.475107 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nqrd9"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.476062 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.476233 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.478082 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.479045 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6xvv6"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.480358 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.481835 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wxj9k"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483343 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zc6nf"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483627 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-service-ca\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483658 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-signing-key\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483680 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tv6q\" (UniqueName: \"kubernetes.io/projected/04489d51-a83f-4152-ac0c-d5a46898e21a-kube-api-access-7tv6q\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483698 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313a7d22-f379-4a34-8797-77a77c0ddc97-serving-cert\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483713 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvw4q\" (UniqueName: \"kubernetes.io/projected/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-kube-api-access-vvw4q\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483755 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbd6m\" (UniqueName: \"kubernetes.io/projected/854dc2ee-5769-484e-a9a4-69a592dcaac1-kube-api-access-cbd6m\") pod \"downloads-7954f5f757-qxkdv\" (UID: \"854dc2ee-5769-484e-a9a4-69a592dcaac1\") " pod="openshift-console/downloads-7954f5f757-qxkdv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483773 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc152222-074c-4883-bb5f-f0a836a96023-secret-volume\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483790 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac426e70-652b-44e7-83ff-3a6c26942921-serving-cert\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483805 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-service-ca\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483822 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/285b3242-b0f5-47cc-8f8c-5c580bf7b31e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5t8ss\" (UID: \"285b3242-b0f5-47cc-8f8c-5c580bf7b31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483839 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6016b7e-1424-416e-bcd9-a3490eec1493-metrics-tls\") pod \"dns-operator-744455d44c-nqrd9\" (UID: \"d6016b7e-1424-416e-bcd9-a3490eec1493\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483856 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2g5\" (UniqueName: \"kubernetes.io/projected/b4809e56-3d51-4508-954c-df41db145ee7-kube-api-access-lv2g5\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483872 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/edb0db23-72e4-4288-b8e6-84df2e5091b3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483887 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-client\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.483984 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44mb\" (UniqueName: \"kubernetes.io/projected/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-kube-api-access-z44mb\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484108 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6fb363-0b33-4217-82df-61e525432168-config\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484140 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-trusted-ca-bundle\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484164 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-metrics-certs\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484208 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpwtw\" (UniqueName: \"kubernetes.io/projected/ac426e70-652b-44e7-83ff-3a6c26942921-kube-api-access-tpwtw\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484232 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz64f\" (UniqueName: \"kubernetes.io/projected/aa6fb363-0b33-4217-82df-61e525432168-kube-api-access-kz64f\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6kxq\" (UniqueName: \"kubernetes.io/projected/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-kube-api-access-s6kxq\") pod \"router-default-5444994796-qxjwx\" 
(UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484274 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062edadc-90e2-4fe6-9073-d8a32ba6b345-config\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484364 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc152222-074c-4883-bb5f-f0a836a96023-config-volume\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485285 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/062edadc-90e2-4fe6-9073-d8a32ba6b345-config\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485392 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-service-ca\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485433 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.484413 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485495 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-default-certificate\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485529 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04489d51-a83f-4152-ac0c-d5a46898e21a-proxy-tls\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485546 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04489d51-a83f-4152-ac0c-d5a46898e21a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485564 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485583 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clztp\" (UniqueName: \"kubernetes.io/projected/d6016b7e-1424-416e-bcd9-a3490eec1493-kube-api-access-clztp\") pod \"dns-operator-744455d44c-nqrd9\" (UID: \"d6016b7e-1424-416e-bcd9-a3490eec1493\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485602 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ac426e70-652b-44e7-83ff-3a6c26942921-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485625 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqnqw\" (UniqueName: \"kubernetes.io/projected/313a7d22-f379-4a34-8797-77a77c0ddc97-kube-api-access-lqnqw\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485653 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4809e56-3d51-4508-954c-df41db145ee7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485675 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrxdj\" (UniqueName: \"kubernetes.io/projected/cc152222-074c-4883-bb5f-f0a836a96023-kube-api-access-hrxdj\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485690 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkr9l\" (UniqueName: \"kubernetes.io/projected/285b3242-b0f5-47cc-8f8c-5c580bf7b31e-kube-api-access-mkr9l\") pod \"cluster-samples-operator-665b6dd947-5t8ss\" (UID: \"285b3242-b0f5-47cc-8f8c-5c580bf7b31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485715 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa6fb363-0b33-4217-82df-61e525432168-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485731 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4809e56-3d51-4508-954c-df41db145ee7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485748 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-config\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485763 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjhxs\" (UniqueName: \"kubernetes.io/projected/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-kube-api-access-sjhxs\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485777 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/062edadc-90e2-4fe6-9073-d8a32ba6b345-serving-cert\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485793 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485809 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-signing-cabundle\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485831 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qckgc\" (UniqueName: \"kubernetes.io/projected/062edadc-90e2-4fe6-9073-d8a32ba6b345-kube-api-access-qckgc\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485847 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-serving-cert\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485868 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8fcp5\" (UniqueName: \"kubernetes.io/projected/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-kube-api-access-8fcp5\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485882 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-serving-cert\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485899 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-oauth-serving-cert\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485914 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aa6fb363-0b33-4217-82df-61e525432168-images\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485931 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.485948 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h97s\" (UniqueName: \"kubernetes.io/projected/edb0db23-72e4-4288-b8e6-84df2e5091b3-kube-api-access-5h97s\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486684 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/062edadc-90e2-4fe6-9073-d8a32ba6b345-trusted-ca\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486716 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-oauth-config\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486733 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-service-ca-bundle\") pod \"router-default-5444994796-qxjwx\" 
(UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486756 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-ca\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486772 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-config\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486786 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/edb0db23-72e4-4288-b8e6-84df2e5091b3-srv-cert\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486805 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-stats-auth\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486822 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4809e56-3d51-4508-954c-df41db145ee7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486846 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-config\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.487138 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-config\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.487240 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04489d51-a83f-4152-ac0c-d5a46898e21a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.487506 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.487533 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ac426e70-652b-44e7-83ff-3a6c26942921-available-featuregates\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.488910 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/062edadc-90e2-4fe6-9073-d8a32ba6b345-trusted-ca\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486251 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6fb363-0b33-4217-82df-61e525432168-config\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.489803 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.489863 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc152222-074c-4883-bb5f-f0a836a96023-secret-volume\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.490040 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/edb0db23-72e4-4288-b8e6-84df2e5091b3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.490598 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4809e56-3d51-4508-954c-df41db145ee7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.490710 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-signing-cabundle\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: 
\"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.490738 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-config\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.490897 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-oauth-serving-cert\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.491445 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/aa6fb363-0b33-4217-82df-61e525432168-images\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.491469 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa6fb363-0b33-4217-82df-61e525432168-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486371 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-trusted-ca-bundle\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486647 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4pzk"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.491794 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/062edadc-90e2-4fe6-9073-d8a32ba6b345-serving-cert\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.491798 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.491858 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qxkdv"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.491872 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.491897 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac426e70-652b-44e7-83ff-3a6c26942921-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.486208 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.492837 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.493009 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4809e56-3d51-4508-954c-df41db145ee7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.493792 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.494089 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-serving-cert\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.495058 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.495362 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.495987 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-oauth-config\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.496130 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/edb0db23-72e4-4288-b8e6-84df2e5091b3-srv-cert\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.496175 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4hh6l"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.496345 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/285b3242-b0f5-47cc-8f8c-5c580bf7b31e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5t8ss\" (UID: \"285b3242-b0f5-47cc-8f8c-5c580bf7b31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.496448 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-serving-cert\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.496760 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6016b7e-1424-416e-bcd9-a3490eec1493-metrics-tls\") pod \"dns-operator-744455d44c-nqrd9\" (UID: \"d6016b7e-1424-416e-bcd9-a3490eec1493\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.497384 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.498636 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zjjls"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.498668 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-signing-key\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.499702 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.500180 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-sqw5b"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.500659 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.501255 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-ca\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.501403 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.503305 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.508057 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kw7jx"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.509816 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r7lmd"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.512337 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.512427 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.522377 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.524864 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.525875 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.526689 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pxt8j"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.527902 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zjjls"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.528874 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4sh99"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.529529 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.530034 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4sh99"] Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.533705 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.536540 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313a7d22-f379-4a34-8797-77a77c0ddc97-serving-cert\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.553352 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.557723 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-client\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.573177 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.574387 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-etcd-service-ca\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.593129 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.612951 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.633349 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.643134 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313a7d22-f379-4a34-8797-77a77c0ddc97-config\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.653491 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.656367 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc152222-074c-4883-bb5f-f0a836a96023-config-volume\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.673116 4929 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.693118 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.712627 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.720970 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04489d51-a83f-4152-ac0c-d5a46898e21a-proxy-tls\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.755082 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.768654 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-metrics-certs\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.772856 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.792936 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.799212 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-default-certificate\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.813302 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.823371 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-stats-auth\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.833684 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.853380 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.872803 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.883220 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-service-ca-bundle\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.893720 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.912553 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.932469 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.953105 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.987062 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtqj4\" (UniqueName: \"kubernetes.io/projected/1f604268-ce5c-4822-be72-1fe5615cf7bc-kube-api-access-mtqj4\") pod \"apiserver-7bbb656c7d-mpbk4\" (UID: \"1f604268-ce5c-4822-be72-1fe5615cf7bc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:29 crc kubenswrapper[4929]: I1002 11:12:29.993387 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.013885 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.033499 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.053575 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.092877 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.094972 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh6ns\" (UniqueName: \"kubernetes.io/projected/80c38d88-6af4-45a3-bce7-8ad07c8e4e1d-kube-api-access-dh6ns\") pod \"openshift-apiserver-operator-796bbdcf4f-z4jvz\" (UID: \"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.113410 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.134992 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.151756 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkg6d\" (UniqueName: \"kubernetes.io/projected/c619b40e-e812-4689-aa5d-3ad89ec57afc-kube-api-access-pkg6d\") pod \"route-controller-manager-6576b87f9c-l9jm7\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.160473 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.171069 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wslb8\" (UniqueName: \"kubernetes.io/projected/9a73326b-13cd-4b69-b17b-93cd1b59679c-kube-api-access-wslb8\") pod \"oauth-openshift-558db77b4-cf6n9\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.184824 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.187754 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4kt\" (UniqueName: \"kubernetes.io/projected/c24e4fe9-f50e-4989-9316-e17e05b64acc-kube-api-access-jx4kt\") pod \"apiserver-76f77b778f-hzmkl\" (UID: \"c24e4fe9-f50e-4989-9316-e17e05b64acc\") " pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.195257 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.196998 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.213900 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.233829 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.254586 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.273642 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.306213 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnc2\" (UniqueName: \"kubernetes.io/projected/923649eb-ddfc-4299-b94e-3f549a863233-kube-api-access-zhnc2\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.311826 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz"] Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.313320 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.333891 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.353761 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: E1002 11:12:30.365697 4929 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.365721 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7"] Oct 02 11:12:30 crc kubenswrapper[4929]: E1002 11:12:30.365778 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles podName:923649eb-ddfc-4299-b94e-3f549a863233 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:30.865755072 +0000 UTC m=+151.416121436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles") pod "controller-manager-879f6c89f-7wfcz" (UID: "923649eb-ddfc-4299-b94e-3f549a863233") : failed to sync configmap cache: timed out waiting for the condition Oct 02 11:12:30 crc kubenswrapper[4929]: W1002 11:12:30.370761 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc619b40e_e812_4689_aa5d_3ad89ec57afc.slice/crio-aa7ca4073e16e332a9351a2f6a06bc27dc8abbe32c130e16f3fd1d3d4bdfa926 WatchSource:0}: Error finding container aa7ca4073e16e332a9351a2f6a06bc27dc8abbe32c130e16f3fd1d3d4bdfa926: Status 404 returned error can't find the container with id aa7ca4073e16e332a9351a2f6a06bc27dc8abbe32c130e16f3fd1d3d4bdfa926 Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.372851 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.392184 4929 request.go:700] Waited for 1.007103334s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.393836 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.396915 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4"] Oct 02 11:12:30 crc kubenswrapper[4929]: W1002 11:12:30.408312 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f604268_ce5c_4822_be72_1fe5615cf7bc.slice/crio-6ed4298a06ace27700a06472db45e35e42009ffcfc80dfae8a8ac8b2a273fc1a WatchSource:0}: Error finding container 6ed4298a06ace27700a06472db45e35e42009ffcfc80dfae8a8ac8b2a273fc1a: Status 404 returned error can't find the container with id 6ed4298a06ace27700a06472db45e35e42009ffcfc80dfae8a8ac8b2a273fc1a Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.421682 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.428419 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cf6n9"] Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.433328 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.442768 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.452538 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.473623 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.494000 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.512456 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.532887 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.552983 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.572984 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.593866 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.613055 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.620744 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-hzmkl"] Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.636406 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.659788 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.674344 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.713877 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.733577 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.753094 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.773779 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.793687 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.812946 4929 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.833744 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.854334 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.873232 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.893361 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.907120 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.913169 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.931793 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" event={"ID":"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d","Type":"ContainerStarted","Data":"2834cc0ddcffbdf273d7a30776eb6d9378990e580dbf5420f071ab20fc1db8a9"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.931836 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" event={"ID":"80c38d88-6af4-45a3-bce7-8ad07c8e4e1d","Type":"ContainerStarted","Data":"3647d152b964752965eba35159bc07f5457a75941460ad77bb62f048512e5965"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.933044 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.933760 4929 generic.go:334] "Generic (PLEG): container finished" podID="c24e4fe9-f50e-4989-9316-e17e05b64acc" containerID="ecef6b71c75def2299cfe465dc785e16242c1dfb1e83d1c2df6637a61865ad39" exitCode=0 Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.933812 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" event={"ID":"c24e4fe9-f50e-4989-9316-e17e05b64acc","Type":"ContainerDied","Data":"ecef6b71c75def2299cfe465dc785e16242c1dfb1e83d1c2df6637a61865ad39"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.933844 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" event={"ID":"c24e4fe9-f50e-4989-9316-e17e05b64acc","Type":"ContainerStarted","Data":"8425314eb0e9edb0bbc1fa892bb25294b511df26786cacad8df8cdb9b58501d9"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.935543 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" event={"ID":"c619b40e-e812-4689-aa5d-3ad89ec57afc","Type":"ContainerStarted","Data":"655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.935574 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" event={"ID":"c619b40e-e812-4689-aa5d-3ad89ec57afc","Type":"ContainerStarted","Data":"aa7ca4073e16e332a9351a2f6a06bc27dc8abbe32c130e16f3fd1d3d4bdfa926"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.935870 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.938292 4929 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-l9jm7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.938337 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" podUID="c619b40e-e812-4689-aa5d-3ad89ec57afc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.938996 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" event={"ID":"9a73326b-13cd-4b69-b17b-93cd1b59679c","Type":"ContainerStarted","Data":"4b1cd896ad216c083beb89bd7e20943ae0ac3510f6f80ff8577da254951c39fa"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.939036 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" event={"ID":"9a73326b-13cd-4b69-b17b-93cd1b59679c","Type":"ContainerStarted","Data":"be078ba360ac9d00252183dacd3bb1e75b8cc5af47a38a01c86a412b2d40c65d"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.939198 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.940451 4929 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-cf6n9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.940509 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" podUID="9a73326b-13cd-4b69-b17b-93cd1b59679c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.940717 4929 generic.go:334] "Generic (PLEG): container finished" podID="1f604268-ce5c-4822-be72-1fe5615cf7bc" containerID="04bf467001a4177f9785f7cd0e3607ad9a2acd415bda014fbda39558e4bec6da" exitCode=0 Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.940759 4929 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" event={"ID":"1f604268-ce5c-4822-be72-1fe5615cf7bc","Type":"ContainerDied","Data":"04bf467001a4177f9785f7cd0e3607ad9a2acd415bda014fbda39558e4bec6da"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.940779 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" event={"ID":"1f604268-ce5c-4822-be72-1fe5615cf7bc","Type":"ContainerStarted","Data":"6ed4298a06ace27700a06472db45e35e42009ffcfc80dfae8a8ac8b2a273fc1a"} Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.952831 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.972663 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 11:12:30 crc kubenswrapper[4929]: I1002 11:12:30.994715 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.014706 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.034189 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.054553 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.072811 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.092950 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.113107 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.134009 4929 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.153785 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.191002 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tv6q\" (UniqueName: \"kubernetes.io/projected/04489d51-a83f-4152-ac0c-d5a46898e21a-kube-api-access-7tv6q\") pod \"machine-config-controller-84d6567774-ncnkh\" (UID: \"04489d51-a83f-4152-ac0c-d5a46898e21a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.205819 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbd6m\" (UniqueName: \"kubernetes.io/projected/854dc2ee-5769-484e-a9a4-69a592dcaac1-kube-api-access-cbd6m\") pod \"downloads-7954f5f757-qxkdv\" (UID: \"854dc2ee-5769-484e-a9a4-69a592dcaac1\") " pod="openshift-console/downloads-7954f5f757-qxkdv" Oct 02 11:12:31 crc 
kubenswrapper[4929]: I1002 11:12:31.229993 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvw4q\" (UniqueName: \"kubernetes.io/projected/d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e-kube-api-access-vvw4q\") pod \"service-ca-9c57cc56f-wxj9k\" (UID: \"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e\") " pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.253261 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2g5\" (UniqueName: \"kubernetes.io/projected/b4809e56-3d51-4508-954c-df41db145ee7-kube-api-access-lv2g5\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.254227 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qxkdv" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.270573 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz64f\" (UniqueName: \"kubernetes.io/projected/aa6fb363-0b33-4217-82df-61e525432168-kube-api-access-kz64f\") pod \"machine-api-operator-5694c8668f-6xvv6\" (UID: \"aa6fb363-0b33-4217-82df-61e525432168\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.295015 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpwtw\" (UniqueName: \"kubernetes.io/projected/ac426e70-652b-44e7-83ff-3a6c26942921-kube-api-access-tpwtw\") pod \"openshift-config-operator-7777fb866f-j8pwm\" (UID: \"ac426e70-652b-44e7-83ff-3a6c26942921\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.312304 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6kxq\" (UniqueName: \"kubernetes.io/projected/a34747b6-0f96-47d6-a12b-7fb4c40b5bab-kube-api-access-s6kxq\") pod \"router-default-5444994796-qxjwx\" (UID: \"a34747b6-0f96-47d6-a12b-7fb4c40b5bab\") " pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.327800 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.336777 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44mb\" (UniqueName: \"kubernetes.io/projected/6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5-kube-api-access-z44mb\") pod \"openshift-controller-manager-operator-756b6f6bc6-rhnwv\" (UID: \"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.363492 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.363717 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.373578 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqnqw\" (UniqueName: \"kubernetes.io/projected/313a7d22-f379-4a34-8797-77a77c0ddc97-kube-api-access-lqnqw\") pod \"etcd-operator-b45778765-4hh6l\" (UID: \"313a7d22-f379-4a34-8797-77a77c0ddc97\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.392395 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjhxs\" (UniqueName: \"kubernetes.io/projected/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-kube-api-access-sjhxs\") pod \"console-f9d7485db-zc6nf\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.392398 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clztp\" (UniqueName: \"kubernetes.io/projected/d6016b7e-1424-416e-bcd9-a3490eec1493-kube-api-access-clztp\") pod \"dns-operator-744455d44c-nqrd9\" (UID: \"d6016b7e-1424-416e-bcd9-a3490eec1493\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.411595 4929 request.go:700] Waited for 1.924238553s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/collect-profiles/token Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.413557 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4809e56-3d51-4508-954c-df41db145ee7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g28s8\" (UID: \"b4809e56-3d51-4508-954c-df41db145ee7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.436299 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.436791 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrxdj\" (UniqueName: \"kubernetes.io/projected/cc152222-074c-4883-bb5f-f0a836a96023-kube-api-access-hrxdj\") pod \"collect-profiles-29323380-4xtkb\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.460832 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkr9l\" (UniqueName: \"kubernetes.io/projected/285b3242-b0f5-47cc-8f8c-5c580bf7b31e-kube-api-access-mkr9l\") pod \"cluster-samples-operator-665b6dd947-5t8ss\" (UID: \"285b3242-b0f5-47cc-8f8c-5c580bf7b31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.474985 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h97s\" (UniqueName: \"kubernetes.io/projected/edb0db23-72e4-4288-b8e6-84df2e5091b3-kube-api-access-5h97s\") pod \"olm-operator-6b444d44fb-tbxkj\" (UID: \"edb0db23-72e4-4288-b8e6-84df2e5091b3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.491334 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qckgc\" (UniqueName: \"kubernetes.io/projected/062edadc-90e2-4fe6-9073-d8a32ba6b345-kube-api-access-qckgc\") pod \"console-operator-58897d9998-gpcbf\" (UID: \"062edadc-90e2-4fe6-9073-d8a32ba6b345\") " pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.514366 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.520372 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.522444 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.537486 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.544990 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qxkdv"] Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.545317 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.553486 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fcp5\" (UniqueName: \"kubernetes.io/projected/72c7ae81-8e0d-4b7e-9c26-8b35dac082b3-kube-api-access-8fcp5\") pod \"authentication-operator-69f744f599-jxdhx\" (UID: \"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.556108 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.568102 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.574760 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.582118 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.591286 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.593511 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.600687 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.611212 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.613293 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.621155 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.635485 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.638381 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.643312 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:31 crc kubenswrapper[4929]: W1002 11:12:31.643885 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854dc2ee_5769_484e_a9a4_69a592dcaac1.slice/crio-5ef4c4abdb03d75c9ed71dcfcdd9703170b45af20d0a3d1008d201905d040dcd WatchSource:0}: Error finding container 5ef4c4abdb03d75c9ed71dcfcdd9703170b45af20d0a3d1008d201905d040dcd: Status 404 returned error can't find the container with id 5ef4c4abdb03d75c9ed71dcfcdd9703170b45af20d0a3d1008d201905d040dcd Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.653463 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.662602 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wxj9k"] Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.673987 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.696820 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.721789 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh"] Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.739543 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.748238 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7wfcz\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.776882 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.782394 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-6xvv6"] Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825424 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmbks\" (UniqueName: \"kubernetes.io/projected/00ee1580-a680-421c-8e39-edd438584800-kube-api-access-rmbks\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825471 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5hr\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-kube-api-access-9p5hr\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825497 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ddbe1a2-1d8a-4e89-8284-4664a211f968-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825530 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94158d82-3849-4716-a7a8-61b0c6236d1e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825575 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59cbed61-dc5e-4490-a1e5-45d12c65bf48-srv-cert\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825606 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd451cea-3090-4483-9141-a2f45ece2ce2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825621 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00ee1580-a680-421c-8e39-edd438584800-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825637 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/029e08ec-dabe-4a49-8ec6-b1b22999b713-machine-approver-tls\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825676 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ddbe1a2-1d8a-4e89-8284-4664a211f968-config\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825690 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-proxy-tls\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825706 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/029e08ec-dabe-4a49-8ec6-b1b22999b713-auth-proxy-config\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825741 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxtjk\" (UniqueName: \"kubernetes.io/projected/78ee48b5-a924-4253-8309-cdff7355ec6d-kube-api-access-rxtjk\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825755 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sr82\" (UniqueName: \"kubernetes.io/projected/029e08ec-dabe-4a49-8ec6-b1b22999b713-kube-api-access-6sr82\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825770 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825793 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd451cea-3090-4483-9141-a2f45ece2ce2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825826 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-trusted-ca\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825923 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.825939 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd451cea-3090-4483-9141-a2f45ece2ce2-config\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 
11:12:31.825984 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg27w\" (UniqueName: \"kubernetes.io/projected/59cbed61-dc5e-4490-a1e5-45d12c65bf48-kube-api-access-kg27w\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826052 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59cbed61-dc5e-4490-a1e5-45d12c65bf48-profile-collector-cert\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826086 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-tls\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826151 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94158d82-3849-4716-a7a8-61b0c6236d1e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826181 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ee1580-a680-421c-8e39-edd438584800-metrics-tls\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826256 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826314 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-bound-sa-token\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826369 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-images\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 
11:12:31.826439 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826456 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhftp\" (UniqueName: \"kubernetes.io/projected/76d63ecb-462e-4002-8d76-b083b360a907-kube-api-access-nhftp\") pod \"migrator-59844c95c7-7r78f\" (UID: \"76d63ecb-462e-4002-8d76-b083b360a907\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826526 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-certificates\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826574 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ee1580-a680-421c-8e39-edd438584800-trusted-ca\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826589 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ddbe1a2-1d8a-4e89-8284-4664a211f968-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826633 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029e08ec-dabe-4a49-8ec6-b1b22999b713-config\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.826668 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjxx\" (UniqueName: \"kubernetes.io/projected/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-kube-api-access-gtjxx\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: E1002 11:12:31.833377 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:32.333361647 +0000 UTC m=+152.883728011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.920103 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927356 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927516 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-trusted-ca\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927542 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7e5719-e329-4073-9ef4-6c26073399f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927562 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-registration-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927580 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v87dt\" (UniqueName: \"kubernetes.io/projected/b9794854-0c90-4d9a-a57f-3daf45d35d8c-kube-api-access-v87dt\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927600 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927624 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd451cea-3090-4483-9141-a2f45ece2ce2-config\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " 
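This failure is a startup race rather than a persistent fault: the image registry's PVC is backed by the hostpath CSI driver, whose node plugin (csi-hostpathplugin-kw7jx, whose volumes are being mounted just below) has not yet re-registered with the restarted kubelet, so no CSI client can be built for it. Note that kubelet does not spin on the error: nestedpendingoperations.go imposes exponential backoff, which is what "No retries permitted until ... (durationBeforeRetry 500ms)" records. A sketch of that backoff shape; the 500ms comes from the log line, while the roughly-two-minute cap is an assumption based on kubelet's defaults:

package main

import (
	"fmt"
	"time"
)

const (
	initialDurationBeforeRetry = 500 * time.Millisecond        // first retry delay, as logged
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second // assumed cap
)

type backoff struct {
	durationBeforeRetry time.Duration
}

// update doubles the delay on each consecutive failure, up to the cap.
func (b *backoff) update() {
	switch {
	case b.durationBeforeRetry == 0:
		b.durationBeforeRetry = initialDurationBeforeRetry
	case 2*b.durationBeforeRetry < maxDurationBeforeRetry:
		b.durationBeforeRetry *= 2
	default:
		b.durationBeforeRetry = maxDurationBeforeRetry
	}
}

func main() {
	var b backoff
	now := time.Now()
	for attempt := 1; attempt <= 4; attempt++ {
		b.update()
		fmt.Printf("attempt %d failed; no retries permitted until %s (durationBeforeRetry %s)\n",
			attempt, now.Add(b.durationBeforeRetry).Format(time.RFC3339Nano), b.durationBeforeRetry)
	}
}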
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927650 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg27w\" (UniqueName: \"kubernetes.io/projected/59cbed61-dc5e-4490-a1e5-45d12c65bf48-kube-api-access-kg27w\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927668 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp7wv\" (UniqueName: \"kubernetes.io/projected/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-kube-api-access-pp7wv\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927707 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59cbed61-dc5e-4490-a1e5-45d12c65bf48-profile-collector-cert\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927724 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-tls\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927739 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-metrics-tls\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927758 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c43f5d6e-be64-4291-bd66-2548210bc566-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rch4g\" (UID: \"c43f5d6e-be64-4291-bd66-2548210bc566\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927774 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkq9\" (UniqueName: \"kubernetes.io/projected/ce2732c9-d9ca-4f15-a43c-a92186f698d8-kube-api-access-fqkq9\") pod \"ingress-canary-zjjls\" (UID: \"ce2732c9-d9ca-4f15-a43c-a92186f698d8\") " pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927795 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94158d82-3849-4716-a7a8-61b0c6236d1e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" 
Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927822 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kll4q\" (UniqueName: \"kubernetes.io/projected/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-kube-api-access-kll4q\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927842 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ee1580-a680-421c-8e39-edd438584800-metrics-tls\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927869 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a44609-f273-4180-b294-1362acf3d0af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927887 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-plugins-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927907 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927935 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-bound-sa-token\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.927968 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-serving-cert\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928073 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-images\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928093 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk2kz\" (UniqueName: \"kubernetes.io/projected/c43f5d6e-be64-4291-bd66-2548210bc566-kube-api-access-nk2kz\") pod \"control-plane-machine-set-operator-78cbb6b69f-rch4g\" (UID: \"c43f5d6e-be64-4291-bd66-2548210bc566\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928117 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-node-bootstrap-token\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928147 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcl4s\" (UniqueName: \"kubernetes.io/projected/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-kube-api-access-bcl4s\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928166 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhftp\" (UniqueName: \"kubernetes.io/projected/76d63ecb-462e-4002-8d76-b083b360a907-kube-api-access-nhftp\") pod \"migrator-59844c95c7-7r78f\" (UID: \"76d63ecb-462e-4002-8d76-b083b360a907\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928183 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-certificates\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928201 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e7e5719-e329-4073-9ef4-6c26073399f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928218 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ee1580-a680-421c-8e39-edd438584800-trusted-ca\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928233 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ddbe1a2-1d8a-4e89-8284-4664a211f968-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928250 4929 reconciler_common.go:245] 
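The interleaved reconciler_common.go messages are one loop in two phases: the volume manager's reconciler diffs its desired state of world (volumes the scheduled pods need) against the actual state (what is already attached and mounted), first confirming controller attachment (VerifyControllerAttachedVolume, line 245), then mounting (MountVolume, line 218), after which operation_generator reports "MountVolume.SetUp succeeded". A compressed sketch of that diff; all names below are invented for illustration, not kubelet's types:

package main

import "fmt"

type vol struct{ name, pod string }

// reconcile starts the missing operations for every desired volume that
// is not yet in the actual (mounted) state, mirroring the two log phases.
func reconcile(desired []vol, mounted map[string]bool) {
	for _, v := range desired {
		if mounted[v.name] {
			continue
		}
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod %q\n", v.name, v.pod)
		fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
	}
}

func main() {
	desired := []vol{
		{"serving-cert", "kube-apiserver-operator-766d6c64bb-sdssd"},
		{"config", "kube-apiserver-operator-766d6c64bb-sdssd"},
	}
	reconcile(desired, map[string]bool{"config": true})
}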
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-certs\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928401 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029e08ec-dabe-4a49-8ec6-b1b22999b713-config\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928423 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwjf\" (UniqueName: \"kubernetes.io/projected/0e1b7310-1ee8-4077-b873-a6a54f445381-kube-api-access-bmwjf\") pod \"package-server-manager-789f6589d5-hnftv\" (UID: \"0e1b7310-1ee8-4077-b873-a6a54f445381\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928439 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7e5719-e329-4073-9ef4-6c26073399f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928507 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjxx\" (UniqueName: \"kubernetes.io/projected/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-kube-api-access-gtjxx\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928592 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-csi-data-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928639 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e1b7310-1ee8-4077-b873-a6a54f445381-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hnftv\" (UID: \"0e1b7310-1ee8-4077-b873-a6a54f445381\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928661 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-webhook-cert\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928676 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce2732c9-d9ca-4f15-a43c-a92186f698d8-cert\") pod \"ingress-canary-zjjls\" (UID: \"ce2732c9-d9ca-4f15-a43c-a92186f698d8\") " pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928690 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-config\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928705 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a44609-f273-4180-b294-1362acf3d0af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928726 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmbks\" (UniqueName: \"kubernetes.io/projected/00ee1580-a680-421c-8e39-edd438584800-kube-api-access-rmbks\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928743 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5hr\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-kube-api-access-9p5hr\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928759 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ddbe1a2-1d8a-4e89-8284-4664a211f968-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928776 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg482\" (UniqueName: \"kubernetes.io/projected/e4a44609-f273-4180-b294-1362acf3d0af-kube-api-access-cg482\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928794 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94158d82-3849-4716-a7a8-61b0c6236d1e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928808 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9118fbfd-e206-451a-b873-7041e07207a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pxt8j\" (UID: \"9118fbfd-e206-451a-b873-7041e07207a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928834 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-config-volume\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928850 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59cbed61-dc5e-4490-a1e5-45d12c65bf48-srv-cert\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928864 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd451cea-3090-4483-9141-a2f45ece2ce2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928879 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00ee1580-a680-421c-8e39-edd438584800-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928895 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/029e08ec-dabe-4a49-8ec6-b1b22999b713-machine-approver-tls\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928912 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-tmpfs\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928916 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd451cea-3090-4483-9141-a2f45ece2ce2-config\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928930 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgkf\" (UniqueName: \"kubernetes.io/projected/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-kube-api-access-2hgkf\") pod 
\"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.928951 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-proxy-tls\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929004 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ddbe1a2-1d8a-4e89-8284-4664a211f968-config\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929020 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-mountpoint-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929043 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/029e08ec-dabe-4a49-8ec6-b1b22999b713-auth-proxy-config\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929062 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxtjk\" (UniqueName: \"kubernetes.io/projected/78ee48b5-a924-4253-8309-cdff7355ec6d-kube-api-access-rxtjk\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929079 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sr82\" (UniqueName: \"kubernetes.io/projected/029e08ec-dabe-4a49-8ec6-b1b22999b713-kube-api-access-6sr82\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929095 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-apiservice-cert\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929111 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-socket-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929127 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929148 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd451cea-3090-4483-9141-a2f45ece2ce2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929167 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l78ff\" (UniqueName: \"kubernetes.io/projected/9118fbfd-e206-451a-b873-7041e07207a4-kube-api-access-l78ff\") pod \"multus-admission-controller-857f4d67dd-pxt8j\" (UID: \"9118fbfd-e206-451a-b873-7041e07207a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.929589 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: E1002 11:12:31.930212 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:32.430195887 +0000 UTC m=+152.980562251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.930327 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/029e08ec-dabe-4a49-8ec6-b1b22999b713-config\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.930602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94158d82-3849-4716-a7a8-61b0c6236d1e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.931618 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-images\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.932519 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-trusted-ca\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.936319 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59cbed61-dc5e-4490-a1e5-45d12c65bf48-profile-collector-cert\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.937192 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.937802 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/029e08ec-dabe-4a49-8ec6-b1b22999b713-auth-proxy-config\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.937920 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
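Same root cause as the MountDevice failure above, this time on the unmount path: tearing down the old registry pod's volume also needs a CSI client for kubevirt.io.hostpath-provisioner. Both operations will succeed once the plugin registers through the registration-dir socket being mounted for csi-hostpathplugin-kw7jx. The set of drivers a node has actually registered is published in its CSINode object; a minimal client-go sketch for inspecting it (the node name "crc" is taken from these logs):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	clientset := kubernetes.NewForConfigOrDie(cfg)

	csiNode, err := clientset.StorageV1().CSINodes().Get(
		context.Background(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Until kubevirt.io.hostpath-provisioner shows up in this list, every
	// MountDevice/TearDownAt for its volumes fails as in the log above.
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered CSI driver:", d.Name)
	}
}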
\"kubernetes.io/secret/029e08ec-dabe-4a49-8ec6-b1b22999b713-machine-approver-tls\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.937983 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ddbe1a2-1d8a-4e89-8284-4664a211f968-config\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.938154 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-certificates\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.938927 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.953926 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00ee1580-a680-421c-8e39-edd438584800-trusted-ca\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.954405 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94158d82-3849-4716-a7a8-61b0c6236d1e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.954904 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-proxy-tls\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.955430 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd451cea-3090-4483-9141-a2f45ece2ce2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.956040 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59cbed61-dc5e-4490-a1e5-45d12c65bf48-srv-cert\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: \"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.965211 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qxjwx" event={"ID":"a34747b6-0f96-47d6-a12b-7fb4c40b5bab","Type":"ContainerStarted","Data":"739cc4a64f8306cdaafefaf18a76909d34ce3daf78d88fa9d4a94d8625fca1ac"} Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.965256 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qxjwx" event={"ID":"a34747b6-0f96-47d6-a12b-7fb4c40b5bab","Type":"ContainerStarted","Data":"4f2abf35afbff61adb94822a5827aabeec9b2153452bd48de02c4d9662eb466a"} Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.968056 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" event={"ID":"04489d51-a83f-4152-ac0c-d5a46898e21a","Type":"ContainerStarted","Data":"1b91760307d2c60b1293a5bd8940d1d58e2a552a67ade812e239c255386c7b9a"} Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.974919 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" event={"ID":"1f604268-ce5c-4822-be72-1fe5615cf7bc","Type":"ContainerStarted","Data":"d41d07d8b34e81393cbb6332bf47f44ac9e5210629b477a5eb7695a2d7742ba3"} Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.975930 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qxkdv" event={"ID":"854dc2ee-5769-484e-a9a4-69a592dcaac1","Type":"ContainerStarted","Data":"5ef4c4abdb03d75c9ed71dcfcdd9703170b45af20d0a3d1008d201905d040dcd"} Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.976093 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ddbe1a2-1d8a-4e89-8284-4664a211f968-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.979915 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-tls\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.980052 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" event={"ID":"c24e4fe9-f50e-4989-9316-e17e05b64acc","Type":"ContainerStarted","Data":"97d939352375cdc054dff706ff2cc5331ba7510d695243c7c62f423a28495d65"} Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.980079 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" event={"ID":"c24e4fe9-f50e-4989-9316-e17e05b64acc","Type":"ContainerStarted","Data":"c5d97f4c02bcec19d15bf3da52afee558b33664c5f1b15d558f75a94625e890e"} Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.987306 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg27w\" (UniqueName: \"kubernetes.io/projected/59cbed61-dc5e-4490-a1e5-45d12c65bf48-kube-api-access-kg27w\") pod \"catalog-operator-68c6474976-sp9ft\" (UID: 
\"59cbed61-dc5e-4490-a1e5-45d12c65bf48\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.988835 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:31 crc kubenswrapper[4929]: I1002 11:12:31.999520 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" event={"ID":"aa6fb363-0b33-4217-82df-61e525432168","Type":"ContainerStarted","Data":"1b8fb02ef88fed3ab98869d97146be4cac9a461b9ad3c967b89222e547bb5cc1"} Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.013734 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmbks\" (UniqueName: \"kubernetes.io/projected/00ee1580-a680-421c-8e39-edd438584800-kube-api-access-rmbks\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.020928 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00ee1580-a680-421c-8e39-edd438584800-metrics-tls\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.027704 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" event={"ID":"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e","Type":"ContainerStarted","Data":"d39239a16544a6b9b85b61c195a248356efc707969685df0631487716df5fa37"} Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.029904 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kll4q\" (UniqueName: \"kubernetes.io/projected/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-kube-api-access-kll4q\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.029941 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a44609-f273-4180-b294-1362acf3d0af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.029978 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-plugins-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030005 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-serving-cert\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:32 crc 
kubenswrapper[4929]: I1002 11:12:32.030033 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk2kz\" (UniqueName: \"kubernetes.io/projected/c43f5d6e-be64-4291-bd66-2548210bc566-kube-api-access-nk2kz\") pod \"control-plane-machine-set-operator-78cbb6b69f-rch4g\" (UID: \"c43f5d6e-be64-4291-bd66-2548210bc566\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030052 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030068 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcl4s\" (UniqueName: \"kubernetes.io/projected/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-kube-api-access-bcl4s\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030089 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-node-bootstrap-token\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030105 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e7e5719-e329-4073-9ef4-6c26073399f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030121 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-certs\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030142 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwjf\" (UniqueName: \"kubernetes.io/projected/0e1b7310-1ee8-4077-b873-a6a54f445381-kube-api-access-bmwjf\") pod \"package-server-manager-789f6589d5-hnftv\" (UID: \"0e1b7310-1ee8-4077-b873-a6a54f445381\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030158 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7e5719-e329-4073-9ef4-6c26073399f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030179 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-csi-data-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030194 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e1b7310-1ee8-4077-b873-a6a54f445381-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hnftv\" (UID: \"0e1b7310-1ee8-4077-b873-a6a54f445381\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030214 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-webhook-cert\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030228 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-config\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030245 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a44609-f273-4180-b294-1362acf3d0af-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030265 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce2732c9-d9ca-4f15-a43c-a92186f698d8-cert\") pod \"ingress-canary-zjjls\" (UID: \"ce2732c9-d9ca-4f15-a43c-a92186f698d8\") " pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030295 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg482\" (UniqueName: \"kubernetes.io/projected/e4a44609-f273-4180-b294-1362acf3d0af-kube-api-access-cg482\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030313 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9118fbfd-e206-451a-b873-7041e07207a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pxt8j\" (UID: \"9118fbfd-e206-451a-b873-7041e07207a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030332 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-config-volume\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030357 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgkf\" (UniqueName: \"kubernetes.io/projected/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-kube-api-access-2hgkf\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030373 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-tmpfs\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030394 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-mountpoint-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030421 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-apiservice-cert\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030437 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-socket-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030460 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l78ff\" (UniqueName: \"kubernetes.io/projected/9118fbfd-e206-451a-b873-7041e07207a4-kube-api-access-l78ff\") pod \"multus-admission-controller-857f4d67dd-pxt8j\" (UID: \"9118fbfd-e206-451a-b873-7041e07207a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030477 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7e5719-e329-4073-9ef4-6c26073399f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030494 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-registration-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 
11:12:32.030514 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v87dt\" (UniqueName: \"kubernetes.io/projected/b9794854-0c90-4d9a-a57f-3daf45d35d8c-kube-api-access-v87dt\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030533 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp7wv\" (UniqueName: \"kubernetes.io/projected/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-kube-api-access-pp7wv\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030557 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-metrics-tls\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030561 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjxx\" (UniqueName: \"kubernetes.io/projected/2a8296b7-a5ae-4091-a32b-0fe704cbc3f9-kube-api-access-gtjxx\") pod \"machine-config-operator-74547568cd-fvqtg\" (UID: \"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030575 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c43f5d6e-be64-4291-bd66-2548210bc566-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rch4g\" (UID: \"c43f5d6e-be64-4291-bd66-2548210bc566\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.030671 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqkq9\" (UniqueName: \"kubernetes.io/projected/ce2732c9-d9ca-4f15-a43c-a92186f698d8-kube-api-access-fqkq9\") pod \"ingress-canary-zjjls\" (UID: \"ce2732c9-d9ca-4f15-a43c-a92186f698d8\") " pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.031537 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-config\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.032178 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-mountpoint-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.035838 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/c43f5d6e-be64-4291-bd66-2548210bc566-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rch4g\" (UID: \"c43f5d6e-be64-4291-bd66-2548210bc566\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.035894 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-socket-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.035915 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-plugins-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.035974 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-registration-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.036907 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-config-volume\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.037197 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.037443 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5hr\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-kube-api-access-9p5hr\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.037489 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:32.537473553 +0000 UTC m=+153.087839917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.038257 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-tmpfs\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.038738 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e7e5719-e329-4073-9ef4-6c26073399f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.040267 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a44609-f273-4180-b294-1362acf3d0af-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.041215 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b9794854-0c90-4d9a-a57f-3daf45d35d8c-csi-data-dir\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.042806 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e1b7310-1ee8-4077-b873-a6a54f445381-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hnftv\" (UID: \"0e1b7310-1ee8-4077-b873-a6a54f445381\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.044105 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-webhook-cert\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.047059 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-node-bootstrap-token\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.056662 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/ce2732c9-d9ca-4f15-a43c-a92186f698d8-cert\") pod \"ingress-canary-zjjls\" (UID: \"ce2732c9-d9ca-4f15-a43c-a92186f698d8\") " pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.060815 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gpcbf"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.062362 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-apiservice-cert\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.064412 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9118fbfd-e206-451a-b873-7041e07207a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pxt8j\" (UID: \"9118fbfd-e206-451a-b873-7041e07207a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.064628 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e7e5719-e329-4073-9ef4-6c26073399f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.078102 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-serving-cert\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.078107 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-metrics-tls\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.078461 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-certs\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.087569 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ddbe1a2-1d8a-4e89-8284-4664a211f968-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5nc7z\" (UID: \"4ddbe1a2-1d8a-4e89-8284-4664a211f968\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.090676 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a44609-f273-4180-b294-1362acf3d0af-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.094227 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-bound-sa-token\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.097640 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhftp\" (UniqueName: \"kubernetes.io/projected/76d63ecb-462e-4002-8d76-b083b360a907-kube-api-access-nhftp\") pod \"migrator-59844c95c7-7r78f\" (UID: \"76d63ecb-462e-4002-8d76-b083b360a907\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.105097 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.113304 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/00ee1580-a680-421c-8e39-edd438584800-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6q5rh\" (UID: \"00ee1580-a680-421c-8e39-edd438584800\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.132704 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.134058 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:32.634021493 +0000 UTC m=+153.184388027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.136093 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sr82\" (UniqueName: \"kubernetes.io/projected/029e08ec-dabe-4a49-8ec6-b1b22999b713-kube-api-access-6sr82\") pod \"machine-approver-56656f9798-4c4v9\" (UID: \"029e08ec-dabe-4a49-8ec6-b1b22999b713\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.153750 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.155707 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxtjk\" (UniqueName: \"kubernetes.io/projected/78ee48b5-a924-4253-8309-cdff7355ec6d-kube-api-access-rxtjk\") pod \"marketplace-operator-79b997595-h4pzk\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.177396 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd451cea-3090-4483-9141-a2f45ece2ce2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-sdssd\" (UID: \"cd451cea-3090-4483-9141-a2f45ece2ce2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.204030 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.226186 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkq9\" (UniqueName: \"kubernetes.io/projected/ce2732c9-d9ca-4f15-a43c-a92186f698d8-kube-api-access-fqkq9\") pod \"ingress-canary-zjjls\" (UID: \"ce2732c9-d9ca-4f15-a43c-a92186f698d8\") " pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:32 crc kubenswrapper[4929]: W1002 11:12:32.231866 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ab99922_e8b9_4c89_b9a0_ad9f1cced2e5.slice/crio-d5f599454d730b46d4692b47e43eac95da47d53b69e3d21693bd71a756014479 WatchSource:0}: Error finding container d5f599454d730b46d4692b47e43eac95da47d53b69e3d21693bd71a756014479: Status 404 returned error can't find the container with id d5f599454d730b46d4692b47e43eac95da47d53b69e3d21693bd71a756014479 Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.234977 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 
crc kubenswrapper[4929]: E1002 11:12:32.235425 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:32.735411379 +0000 UTC m=+153.285777743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.239798 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kll4q\" (UniqueName: \"kubernetes.io/projected/bb86ef9b-bea0-4fb7-b11b-804e84be19cf-kube-api-access-kll4q\") pod \"service-ca-operator-777779d784-lrrcl\" (UID: \"bb86ef9b-bea0-4fb7-b11b-804e84be19cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.252684 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3e7e5719-e329-4073-9ef4-6c26073399f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ln27l\" (UID: \"3e7e5719-e329-4073-9ef4-6c26073399f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.270615 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.271200 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp7wv\" (UniqueName: \"kubernetes.io/projected/380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba-kube-api-access-pp7wv\") pod \"machine-config-server-sqw5b\" (UID: \"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba\") " pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.281809 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.294988 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.298551 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v87dt\" (UniqueName: \"kubernetes.io/projected/b9794854-0c90-4d9a-a57f-3daf45d35d8c-kube-api-access-v87dt\") pod \"csi-hostpathplugin-kw7jx\" (UID: \"b9794854-0c90-4d9a-a57f-3daf45d35d8c\") " pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.303659 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.309880 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.315263 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jxdhx"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.318720 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.321575 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgkf\" (UniqueName: \"kubernetes.io/projected/ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4-kube-api-access-2hgkf\") pod \"packageserver-d55dfcdfc-q6pdb\" (UID: \"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.324022 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.326081 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zc6nf"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.328335 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.330929 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.335633 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l78ff\" (UniqueName: \"kubernetes.io/projected/9118fbfd-e206-451a-b873-7041e07207a4-kube-api-access-l78ff\") pod \"multus-admission-controller-857f4d67dd-pxt8j\" (UID: \"9118fbfd-e206-451a-b873-7041e07207a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.336107 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.336596 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:32.836567467 +0000 UTC m=+153.386933831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.338300 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" Oct 02 11:12:32 crc kubenswrapper[4929]: W1002 11:12:32.343498 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72c7ae81_8e0d_4b7e_9c26_8b35dac082b3.slice/crio-787a5cb8a95d3ef1d5b92b6a7e5f579972e97a6aae306c0defa14c60937f09b5 WatchSource:0}: Error finding container 787a5cb8a95d3ef1d5b92b6a7e5f579972e97a6aae306c0defa14c60937f09b5: Status 404 returned error can't find the container with id 787a5cb8a95d3ef1d5b92b6a7e5f579972e97a6aae306c0defa14c60937f09b5 Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.350729 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwjf\" (UniqueName: \"kubernetes.io/projected/0e1b7310-1ee8-4077-b873-a6a54f445381-kube-api-access-bmwjf\") pod \"package-server-manager-789f6589d5-hnftv\" (UID: \"0e1b7310-1ee8-4077-b873-a6a54f445381\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.353110 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.359075 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.364308 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qxjwx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.366686 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.367206 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.367258 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.383936 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk2kz\" (UniqueName: \"kubernetes.io/projected/c43f5d6e-be64-4291-bd66-2548210bc566-kube-api-access-nk2kz\") pod \"control-plane-machine-set-operator-78cbb6b69f-rch4g\" (UID: \"c43f5d6e-be64-4291-bd66-2548210bc566\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.401003 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcl4s\" (UniqueName: \"kubernetes.io/projected/651a0356-18c1-46e1-a4c4-a4c409cd3a1e-kube-api-access-bcl4s\") pod \"dns-default-4sh99\" (UID: \"651a0356-18c1-46e1-a4c4-a4c409cd3a1e\") " pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.424702 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.430032 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zjjls" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.430788 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.434060 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg482\" (UniqueName: \"kubernetes.io/projected/e4a44609-f273-4180-b294-1362acf3d0af-kube-api-access-cg482\") pod \"kube-storage-version-migrator-operator-b67b599dd-rbgn4\" (UID: \"e4a44609-f273-4180-b294-1362acf3d0af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.439105 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.439407 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:32.939393461 +0000 UTC m=+153.489759825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.442366 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-sqw5b" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.449469 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.459310 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nqrd9"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.487143 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4hh6l"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.508853 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-z4jvz" podStartSLOduration=128.508838175 podStartE2EDuration="2m8.508838175s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:32.508231024 +0000 UTC m=+153.058597388" watchObservedRunningTime="2025-10-02 11:12:32.508838175 +0000 UTC m=+153.059204539" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.511287 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.512699 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.540082 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.540372 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.040345092 +0000 UTC m=+153.590711456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.540599 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.540940 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.040930422 +0000 UTC m=+153.591296786 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.605414 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wfcz"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.615717 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft"] Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.641401 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.641673 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.141642605 +0000 UTC m=+153.692008969 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.641999 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.642323 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.142310097 +0000 UTC m=+153.692676461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.646031 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.681293 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.743833 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.744240 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.244220381 +0000 UTC m=+153.794586745 (durationBeforeRetry 500ms). 
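The E-level nestedpendingoperations.go:348 lines show the kubelet's retry discipline for volume operations: a failed operation on a volume's unique name arms a deadline ("No retries permitted until ..."), and the reconciler's next pass on that volume is rejected until the 500ms durationBeforeRetry elapses, which is why the same pair of errors repeats at roughly half-second intervals. A minimal sketch of that pattern, assuming nothing beyond what the log shows (the real nestedpendingoperations code also tracks in-flight operations and more per-operation state):

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    // pendingOps allows at most one attempt per volume key and, after a
    // failure, refuses retries until a deadline has passed.
    type pendingOps struct {
        mu        sync.Mutex
        notBefore map[string]time.Time // volume key -> earliest allowed retry
    }

    func (p *pendingOps) tryRun(key string, op func() error, backoff time.Duration) error {
        p.mu.Lock()
        if t, ok := p.notBefore[key]; ok && time.Now().Before(t) {
            p.mu.Unlock()
            return fmt.Errorf("no retries permitted until %s", t.Format(time.RFC3339Nano))
        }
        p.mu.Unlock()

        if err := op(); err != nil {
            p.mu.Lock()
            p.notBefore[key] = time.Now().Add(backoff) // the 500ms seen in the log
            p.mu.Unlock()
            return err
        }
        return nil
    }

    func main() {
        p := &pendingOps{notBefore: map[string]time.Time{}}
        mount := func() error {
            return fmt.Errorf("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
        }
        fmt.Println(p.tryRun("pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8", mount, 500*time.Millisecond))
        fmt.Println(p.tryRun("pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8", mount, 500*time.Millisecond)) // blocked by the deadline
    }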
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: W1002 11:12:32.797115 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923649eb_ddfc_4299_b94e_3f549a863233.slice/crio-ae8df8b7e7a1e7dacb97f1be2c880ce214b109a10505a76dca6de6c9a99df242 WatchSource:0}: Error finding container ae8df8b7e7a1e7dacb97f1be2c880ce214b109a10505a76dca6de6c9a99df242: Status 404 returned error can't find the container with id ae8df8b7e7a1e7dacb97f1be2c880ce214b109a10505a76dca6de6c9a99df242 Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.850127 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.850434 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.350422171 +0000 UTC m=+153.900788535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.952732 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.952904 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.452875372 +0000 UTC m=+154.003241736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:32 crc kubenswrapper[4929]: I1002 11:12:32.953286 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:32 crc kubenswrapper[4929]: E1002 11:12:32.954069 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.454052472 +0000 UTC m=+154.004418846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.012126 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.029843 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4pzk"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.036328 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc6nf" event={"ID":"c446dd7b-73fd-4b60-91d9-f1b74df3b69a","Type":"ContainerStarted","Data":"d3d589d05f837e02e76f3ccfc0dd0218e32d25fab44feb970b3adb01773660b3"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.038117 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.039023 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" event={"ID":"313a7d22-f379-4a34-8797-77a77c0ddc97","Type":"ContainerStarted","Data":"5c2b3ee145145c3cde8933e1a4ffac1596e008e82a0711fe0213ed5ed0feeaa3"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.044240 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" event={"ID":"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3","Type":"ContainerStarted","Data":"eba50d0da5f42da4a1a13e063073932d4c506b80c1affe2d172b0b15d8b4e63e"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.044273 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" 
event={"ID":"72c7ae81-8e0d-4b7e-9c26-8b35dac082b3","Type":"ContainerStarted","Data":"787a5cb8a95d3ef1d5b92b6a7e5f579972e97a6aae306c0defa14c60937f09b5"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.058312 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.058616 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.558598896 +0000 UTC m=+154.108965260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.115937 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" event={"ID":"04489d51-a83f-4152-ac0c-d5a46898e21a","Type":"ContainerStarted","Data":"8c5a35265930c75291a81e919e5e90fa704f1ba411cc6b360357dd736fa6fe4e"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.116200 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" event={"ID":"04489d51-a83f-4152-ac0c-d5a46898e21a","Type":"ContainerStarted","Data":"f391079b6ba26db5b6a7d6057f9ee068353f62665802691e90ea6a51cf0d6958"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.134245 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.148546 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.161576 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.161828 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" event={"ID":"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5","Type":"ContainerStarted","Data":"7731dd6e1b1d2af1af4fc52179e8d8a4bd3a4bdd9c5df7f56558cd394862d6ef"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.161880 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" 
event={"ID":"6ab99922-e8b9-4c89-b9a0-ad9f1cced2e5","Type":"ContainerStarted","Data":"d5f599454d730b46d4692b47e43eac95da47d53b69e3d21693bd71a756014479"} Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.163007 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.662992234 +0000 UTC m=+154.213358598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.168157 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" event={"ID":"b4809e56-3d51-4508-954c-df41db145ee7","Type":"ContainerStarted","Data":"3e8514a1457c9b504e268ba2f1b439916dfc9c85b3e5d158b532f7bf408e399b"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.169660 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" event={"ID":"062edadc-90e2-4fe6-9073-d8a32ba6b345","Type":"ContainerStarted","Data":"8700dbab02d16ca177288e0b5b785cc11586aee0cc4b98cec896e2ef54881dac"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.169683 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" event={"ID":"062edadc-90e2-4fe6-9073-d8a32ba6b345","Type":"ContainerStarted","Data":"8029d7b79c4bc29feb5be4a26bfc6e6820eef29ffbe3599d7d7b4ff8612b5190"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.170368 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.171631 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qxkdv" event={"ID":"854dc2ee-5769-484e-a9a4-69a592dcaac1","Type":"ContainerStarted","Data":"d364f37f1750090d302204163b1ce824c2ddcbcaab56476c0bdd3bbd2546c98e"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.172148 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qxkdv" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.173804 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" event={"ID":"59cbed61-dc5e-4490-a1e5-45d12c65bf48","Type":"ContainerStarted","Data":"61feebf598541098657f52fb490002980782cebd3e3f7f7f8bef0f9b6d34789b"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.178645 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" event={"ID":"ac426e70-652b-44e7-83ff-3a6c26942921","Type":"ContainerStarted","Data":"2ebc5efb2eaf731850fee5d36e3b5e30134102dfdd59a8c60a0e22cedc6702f5"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.184918 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" event={"ID":"285b3242-b0f5-47cc-8f8c-5c580bf7b31e","Type":"ContainerStarted","Data":"bec7bfcfb7c213ba2a3ddb9c5adb1244ac7e3da619c192947c658bac552291c3"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.184982 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" event={"ID":"285b3242-b0f5-47cc-8f8c-5c580bf7b31e","Type":"ContainerStarted","Data":"fdd9aa572001fad5d11a13f7ef1874e4b7c38d9aa53b7ab13deb0cb0e72d6b53"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.186850 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" event={"ID":"d6016b7e-1424-416e-bcd9-a3490eec1493","Type":"ContainerStarted","Data":"6e724243c878b90d72458ddc6885f6b89355d9ccc46a335e231c663623148a54"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.192284 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" event={"ID":"aa6fb363-0b33-4217-82df-61e525432168","Type":"ContainerStarted","Data":"0a024d699e10e63e77a012e755b4eb922a7dec5885e2e3561dfe70d99d823fd0"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.192319 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" event={"ID":"aa6fb363-0b33-4217-82df-61e525432168","Type":"ContainerStarted","Data":"c197365f678b3be7cad69e2b23ae04fb41344f27bf280a2b0c02b6b1f56ccb4c"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.195239 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" event={"ID":"cc152222-074c-4883-bb5f-f0a836a96023","Type":"ContainerStarted","Data":"1c08c15156c4badbb0b11d3ced9473f2f0a6d8ba21edd3adfbaff1fa5c064ebb"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.197168 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sqw5b" event={"ID":"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba","Type":"ContainerStarted","Data":"7e7518bd19f161347c65c4a877e69f506ebc81efa0ec1f0ce25cbad92beefd18"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.202815 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" event={"ID":"edb0db23-72e4-4288-b8e6-84df2e5091b3","Type":"ContainerStarted","Data":"f3df7af493c966028e4eacdcc0d4971eacd8b297f9f373ad27fd134d2c9507a3"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.206242 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" event={"ID":"029e08ec-dabe-4a49-8ec6-b1b22999b713","Type":"ContainerStarted","Data":"7d275977698b1a7c84a316ba1517d4f1290baf431692a2f1ead13ccb83fb93ee"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.207848 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" event={"ID":"923649eb-ddfc-4299-b94e-3f549a863233","Type":"ContainerStarted","Data":"ae8df8b7e7a1e7dacb97f1be2c880ce214b109a10505a76dca6de6c9a99df242"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.235901 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" 
event={"ID":"d40ec5d0-2bc5-451c-a843-3f9c3bd0bf6e","Type":"ContainerStarted","Data":"1db3e3d708d536c06e5b95538ee7ef16b6c067a16203a83ea0cdd314f37553eb"} Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.266430 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.271990 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.771969689 +0000 UTC m=+154.322336053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.285126 4929 patch_prober.go:28] interesting pod/downloads-7954f5f757-qxkdv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.289982 4929 patch_prober.go:28] interesting pod/console-operator-58897d9998-gpcbf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.290037 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" podUID="062edadc-90e2-4fe6-9073-d8a32ba6b345" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.285501 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qxkdv" podUID="854dc2ee-5769-484e-a9a4-69a592dcaac1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.349353 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" podStartSLOduration=128.349336773 podStartE2EDuration="2m8.349336773s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:33.347823112 +0000 UTC m=+153.898189476" watchObservedRunningTime="2025-10-02 11:12:33.349336773 +0000 UTC m=+153.899703137" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.370541 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.374281 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.874261985 +0000 UTC m=+154.424628429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.383973 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:12:33 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld Oct 02 11:12:33 crc kubenswrapper[4929]: [+]process-running ok Oct 02 11:12:33 crc kubenswrapper[4929]: healthz check failed Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.384031 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.449200 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.475503 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.475995 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:33.975974582 +0000 UTC m=+154.526340946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.510209 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.523793 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb"] Oct 02 11:12:33 crc kubenswrapper[4929]: W1002 11:12:33.526206 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ee1580_a680_421c_8e39_edd438584800.slice/crio-525afa52163e80c883c47f477eb3732e25ffaf3acaffefe2d1f1f126f577cb16 WatchSource:0}: Error finding container 525afa52163e80c883c47f477eb3732e25ffaf3acaffefe2d1f1f126f577cb16: Status 404 returned error can't find the container with id 525afa52163e80c883c47f477eb3732e25ffaf3acaffefe2d1f1f126f577cb16 Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.577941 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.578259 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.078247828 +0000 UTC m=+154.628614192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.601457 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.614675 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.626473 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4sh99"] Oct 02 11:12:33 crc kubenswrapper[4929]: W1002 11:12:33.648361 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1b7310_1ee8_4077_b873_a6a54f445381.slice/crio-b4ed1f0cdff69387f6cd133c3cbc0e35445ea4f181025c491c371d6f80ec3b04 WatchSource:0}: Error finding container b4ed1f0cdff69387f6cd133c3cbc0e35445ea4f181025c491c371d6f80ec3b04: Status 404 returned error can't find the container with id b4ed1f0cdff69387f6cd133c3cbc0e35445ea4f181025c491c371d6f80ec3b04 Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.665863 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zjjls"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.684815 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.685441 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.185420261 +0000 UTC m=+154.735786625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.695092 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pxt8j"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.712895 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kw7jx"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.713711 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" podStartSLOduration=129.713685477 podStartE2EDuration="2m9.713685477s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:33.712692883 +0000 UTC m=+154.263059247" watchObservedRunningTime="2025-10-02 11:12:33.713685477 +0000 UTC m=+154.264051841" Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.787789 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.788255 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.288239946 +0000 UTC m=+154.838606310 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.889069 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.889423 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.389404593 +0000 UTC m=+154.939770957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.889514 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.889939 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.389919211 +0000 UTC m=+154.940285575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.944358 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.956978 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g"] Oct 02 11:12:33 crc kubenswrapper[4929]: I1002 11:12:33.993048 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:33 crc kubenswrapper[4929]: E1002 11:12:33.993425 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.493403578 +0000 UTC m=+155.043769942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.050322 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jxdhx" podStartSLOduration=130.050304263 podStartE2EDuration="2m10.050304263s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:33.99315481 +0000 UTC m=+154.543521174" watchObservedRunningTime="2025-10-02 11:12:34.050304263 +0000 UTC m=+154.600670627" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.053439 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" podStartSLOduration=129.05342759 podStartE2EDuration="2m9.05342759s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.049528907 +0000 UTC m=+154.599895271" watchObservedRunningTime="2025-10-02 11:12:34.05342759 +0000 UTC m=+154.603793944" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.094157 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.094475 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.594458222 +0000 UTC m=+155.144824586 (durationBeforeRetry 500ms). 
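The pod_startup_latency_tracker lines record each pod's startup SLO: podStartSLOduration is observedRunningTime minus podCreationTimestamp, and since firstStartedPulling/lastFinishedPulling are the zero time (0001-01-01), image pulls contributed nothing; the roughly two minutes of latency is time spent waiting on the node's sync machinery, not on pulls. The arithmetic can be checked directly from the authentication-operator record above; the timestamps are copied from the log, and the layout string assumes these were printed in Go's default time.Time format:

    package main

    import (
        "fmt"
        "time"
    )

    // Go's default time.Time print format, matching the log's timestamps.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func main() {
        created, err := time.Parse(layout, "2025-10-02 11:10:24 +0000 UTC")
        if err != nil {
            panic(err)
        }
        running, err := time.Parse(layout, "2025-10-02 11:12:34.050304263 +0000 UTC")
        if err != nil {
            panic(err)
        }
        // Prints 2m10.050304263s, i.e. exactly podStartSLOduration=130.050304263.
        fmt.Println(running.Sub(created))
    }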
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.123605 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wxj9k" podStartSLOduration=129.123590158 podStartE2EDuration="2m9.123590158s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.117744958 +0000 UTC m=+154.668111342" watchObservedRunningTime="2025-10-02 11:12:34.123590158 +0000 UTC m=+154.673956522" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.193018 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" podStartSLOduration=130.192996431 podStartE2EDuration="2m10.192996431s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.153134838 +0000 UTC m=+154.703501222" watchObservedRunningTime="2025-10-02 11:12:34.192996431 +0000 UTC m=+154.743362795" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.196267 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.196674 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.696656146 +0000 UTC m=+155.247022520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.242385 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" podStartSLOduration=130.242367438 podStartE2EDuration="2m10.242367438s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.241065154 +0000 UTC m=+154.791431528" watchObservedRunningTime="2025-10-02 11:12:34.242367438 +0000 UTC m=+154.792733802" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.243071 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rhnwv" podStartSLOduration=130.243063782 podStartE2EDuration="2m10.243063782s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.195541778 +0000 UTC m=+154.745908142" watchObservedRunningTime="2025-10-02 11:12:34.243063782 +0000 UTC m=+154.793430146" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.292995 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" event={"ID":"285b3242-b0f5-47cc-8f8c-5c580bf7b31e","Type":"ContainerStarted","Data":"fcc9a113f620cbdb25f71284de05136bd2c3f4e2c2975232946e0aa77f4614c1"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.298192 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.298708 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.798688393 +0000 UTC m=+155.349054827 (durationBeforeRetry 500ms). 
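The "SyncLoop (PLEG)" entries are the Pod Lifecycle Event Generator feeding the kubelet's sync loop: it relists containers from the runtime, diffs the result against its previous snapshot, and turns each newly running container into a ContainerStarted event keyed by pod and container ID. The W-level manager.go:1169 "Failed to process watch event ... 404" warnings nearby appear to be the same startup race seen from cAdvisor's side: the container's cgroup shows up before CRI-O can describe the container, so the first lookup 404s and succeeds on a later pass. A toy diff in the spirit of the relist step, using a container ID from the log; the real PLEG also emits ContainerDied and ContainerRemoved and carries per-pod state:

    package main

    import "fmt"

    type snapshot map[string]string // container ID -> runtime state

    // diff compares two relist snapshots and emits lifecycle events for
    // containers whose running state changed between them.
    func diff(prev, curr snapshot) []string {
        var events []string
        for id, s := range curr {
            if s == "running" && prev[id] != "running" {
                events = append(events, "ContainerStarted "+id)
            }
        }
        for id, s := range prev {
            if s == "running" && curr[id] != "running" {
                events = append(events, "ContainerDied "+id)
            }
        }
        return events
    }

    func main() {
        prev := snapshot{}
        curr := snapshot{"5db901c0d884a853b742f6c27c662acf331b4d85c88bc14e8c1301990d04efe8": "running"}
        fmt.Println(diff(prev, curr)) // [ContainerStarted 5db901c0...]
    }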
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.303589 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ncnkh" podStartSLOduration=130.30357064 podStartE2EDuration="2m10.30357064s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.302373849 +0000 UTC m=+154.852740223" watchObservedRunningTime="2025-10-02 11:12:34.30357064 +0000 UTC m=+154.853937004" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.321234 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" event={"ID":"923649eb-ddfc-4299-b94e-3f549a863233","Type":"ContainerStarted","Data":"5db901c0d884a853b742f6c27c662acf331b4d85c88bc14e8c1301990d04efe8"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.321863 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.324619 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc6nf" event={"ID":"c446dd7b-73fd-4b60-91d9-f1b74df3b69a","Type":"ContainerStarted","Data":"3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.330788 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qxkdv" podStartSLOduration=130.33077063 podStartE2EDuration="2m10.33077063s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.324403722 +0000 UTC m=+154.874770086" watchObservedRunningTime="2025-10-02 11:12:34.33077063 +0000 UTC m=+154.881136994" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.337133 4929 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7wfcz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.337181 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" podUID="923649eb-ddfc-4299-b94e-3f549a863233" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.341538 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" 
event={"ID":"59cbed61-dc5e-4490-a1e5-45d12c65bf48","Type":"ContainerStarted","Data":"e834ea46d7e90dfdd2746a78baea515dd3c5cda56e79ad8247f5166aef599ac6"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.342241 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.352122 4929 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-sp9ft container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.352189 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" podUID="59cbed61-dc5e-4490-a1e5-45d12c65bf48" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.363029 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-6xvv6" podStartSLOduration=130.363010602 podStartE2EDuration="2m10.363010602s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.362691201 +0000 UTC m=+154.913057565" watchObservedRunningTime="2025-10-02 11:12:34.363010602 +0000 UTC m=+154.913376966" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.371889 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:12:34 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld Oct 02 11:12:34 crc kubenswrapper[4929]: [+]process-running ok Oct 02 11:12:34 crc kubenswrapper[4929]: healthz check failed Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.372257 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.417116 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" event={"ID":"9118fbfd-e206-451a-b873-7041e07207a4","Type":"ContainerStarted","Data":"7fd4dcb7f0ad677991a0f8c9548fad48404a5c0ae89e607ce3ceda54c86fe4d9"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.434749 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.436647 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.936629098 +0000 UTC m=+155.486995462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.448561 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" event={"ID":"e4a44609-f273-4180-b294-1362acf3d0af","Type":"ContainerStarted","Data":"e6bdc8fa48314a5433e8120dca7c0d01607658b02df38cc95869eff6677481d7"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.449414 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qxjwx" podStartSLOduration=130.449402925 podStartE2EDuration="2m10.449402925s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.416695137 +0000 UTC m=+154.967061491" watchObservedRunningTime="2025-10-02 11:12:34.449402925 +0000 UTC m=+154.999769289" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.449782 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" event={"ID":"bb86ef9b-bea0-4fb7-b11b-804e84be19cf","Type":"ContainerStarted","Data":"d9495dabc077a13892088cf0ba3983d23eab98a33f57f28f5c938bb7cebc6219"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.457476 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" podStartSLOduration=129.45745886 podStartE2EDuration="2m9.45745886s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.447596573 +0000 UTC m=+154.997962937" watchObservedRunningTime="2025-10-02 11:12:34.45745886 +0000 UTC m=+155.007825224" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.464536 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" event={"ID":"d6016b7e-1424-416e-bcd9-a3490eec1493","Type":"ContainerStarted","Data":"4604224c97c441a5310511cf3aacb3d3f9350ab99d9763cb98dbce6ec9a1be05"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.478357 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" podStartSLOduration=130.478338804 podStartE2EDuration="2m10.478338804s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.477552627 +0000 UTC m=+155.027919001" watchObservedRunningTime="2025-10-02 11:12:34.478338804 +0000 UTC m=+155.028705168" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.491108 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" event={"ID":"00ee1580-a680-421c-8e39-edd438584800","Type":"ContainerStarted","Data":"1c1bef3e722058e18079bbcf85bf220ea70edb403d1585838ee6cc6ed3fa956d"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.491180 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" event={"ID":"00ee1580-a680-421c-8e39-edd438584800","Type":"ContainerStarted","Data":"525afa52163e80c883c47f477eb3732e25ffaf3acaffefe2d1f1f126f577cb16"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.510931 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" event={"ID":"029e08ec-dabe-4a49-8ec6-b1b22999b713","Type":"ContainerStarted","Data":"d5b02033ce8bacf1e651a250f1effed57facd46c8b4a7cab43adf7957d669250"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.517542 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zc6nf" podStartSLOduration=130.517526233 podStartE2EDuration="2m10.517526233s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.516194848 +0000 UTC m=+155.066561212" watchObservedRunningTime="2025-10-02 11:12:34.517526233 +0000 UTC m=+155.067892597" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.524857 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" event={"ID":"4ddbe1a2-1d8a-4e89-8284-4664a211f968","Type":"ContainerStarted","Data":"7f3f314875b19a4ebbb781059a360e7c9239ee629fd9354a4ed81f45f441645f"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.524895 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" event={"ID":"4ddbe1a2-1d8a-4e89-8284-4664a211f968","Type":"ContainerStarted","Data":"8a546e51031ff71f29f93eaadb7680db61604f385fb79390d8e3459b4654f26d"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.535717 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.537820 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.037805987 +0000 UTC m=+155.588172351 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.545946 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5t8ss" podStartSLOduration=130.545932994 podStartE2EDuration="2m10.545932994s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.544015479 +0000 UTC m=+155.094381843" watchObservedRunningTime="2025-10-02 11:12:34.545932994 +0000 UTC m=+155.096299358" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.589187 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" podStartSLOduration=129.589171332 podStartE2EDuration="2m9.589171332s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.588353944 +0000 UTC m=+155.138720308" watchObservedRunningTime="2025-10-02 11:12:34.589171332 +0000 UTC m=+155.139537696" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.617875 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" event={"ID":"78ee48b5-a924-4253-8309-cdff7355ec6d","Type":"ContainerStarted","Data":"7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.617932 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" event={"ID":"78ee48b5-a924-4253-8309-cdff7355ec6d","Type":"ContainerStarted","Data":"0365335113a7856626f812c1d4e964bc32e41e49c38672dd23b923aaed472789"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.618146 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.625100 4929 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-h4pzk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.625156 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" podUID="78ee48b5-a924-4253-8309-cdff7355ec6d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.628253 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" 
event={"ID":"3e7e5719-e329-4073-9ef4-6c26073399f5","Type":"ContainerStarted","Data":"15025832e1572701583fb45685ffa2b9f2449c00b06a2834c1fd061c6bd86893"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.640628 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.641030 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.141010114 +0000 UTC m=+155.691376488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.642052 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5nc7z" podStartSLOduration=130.642033739 podStartE2EDuration="2m10.642033739s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.639503603 +0000 UTC m=+155.189869967" watchObservedRunningTime="2025-10-02 11:12:34.642033739 +0000 UTC m=+155.192400103" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.676164 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" podStartSLOduration=130.676132175 podStartE2EDuration="2m10.676132175s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.675506193 +0000 UTC m=+155.225872557" watchObservedRunningTime="2025-10-02 11:12:34.676132175 +0000 UTC m=+155.226498539" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.701783 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" event={"ID":"cc152222-074c-4883-bb5f-f0a836a96023","Type":"ContainerStarted","Data":"27335184540d8540f6619896479f10caf0123a6806776241d36676eddf8e3ef9"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.725537 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4sh99" event={"ID":"651a0356-18c1-46e1-a4c4-a4c409cd3a1e","Type":"ContainerStarted","Data":"7e7089605e1da3b42d9ad0f1a3c0461f089db90db32a27cb2cc34f9765d45816"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.742310 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.742355 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" podStartSLOduration=130.742344668 podStartE2EDuration="2m10.742344668s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.741360354 +0000 UTC m=+155.291726718" watchObservedRunningTime="2025-10-02 11:12:34.742344668 +0000 UTC m=+155.292711032" Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.743547 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.243536439 +0000 UTC m=+155.793902803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.771072 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" event={"ID":"76d63ecb-462e-4002-8d76-b083b360a907","Type":"ContainerStarted","Data":"fb84c5b6fcbc359d98897f0952eb5c87174fc0311beae65dc231bda9f12ffa3c"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.771113 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" event={"ID":"76d63ecb-462e-4002-8d76-b083b360a907","Type":"ContainerStarted","Data":"cedff27293671acda7d25807e66e1d1158625b17215015b54ec986a9fe1236e1"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.802794 4929 generic.go:334] "Generic (PLEG): container finished" podID="ac426e70-652b-44e7-83ff-3a6c26942921" containerID="ab48787e5664508883a21d672fde5519dfcd8b627e0b97d677b63ce7430e1b4d" exitCode=0 Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.802909 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" event={"ID":"ac426e70-652b-44e7-83ff-3a6c26942921","Type":"ContainerDied","Data":"ab48787e5664508883a21d672fde5519dfcd8b627e0b97d677b63ce7430e1b4d"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.812599 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" event={"ID":"c43f5d6e-be64-4291-bd66-2548210bc566","Type":"ContainerStarted","Data":"cf6f2b5ebb5799d989b6ce3925bb7e753a7dce23f3382b325a2bfe2968a6f837"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.818618 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zjjls" 
event={"ID":"ce2732c9-d9ca-4f15-a43c-a92186f698d8","Type":"ContainerStarted","Data":"aab1e078a154ac21e264c89836c54392825d31afd5e8675c6e8669bbd951ff5f"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.842680 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.843103 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.343063851 +0000 UTC m=+155.893430225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.845490 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" event={"ID":"edb0db23-72e4-4288-b8e6-84df2e5091b3","Type":"ContainerStarted","Data":"62764a2827b2a6b90ace16e74f30ecd0326e3efddecb0dcd2ccfa2f44f65833b"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.846239 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.852755 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.859681 4929 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tbxkj container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.859730 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" podUID="edb0db23-72e4-4288-b8e6-84df2e5091b3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.859916 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.359900186 +0000 UTC m=+155.910266550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.865764 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" event={"ID":"313a7d22-f379-4a34-8797-77a77c0ddc97","Type":"ContainerStarted","Data":"d9e1bfdb30b9c639216ed006a10dbbf0fe0d057cce69be4d98021c92937945b7"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.882280 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zjjls" podStartSLOduration=5.88226337 podStartE2EDuration="5.88226337s" podCreationTimestamp="2025-10-02 11:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.871814423 +0000 UTC m=+155.422180787" watchObservedRunningTime="2025-10-02 11:12:34.88226337 +0000 UTC m=+155.432629735" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.889657 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" event={"ID":"0e1b7310-1ee8-4077-b873-a6a54f445381","Type":"ContainerStarted","Data":"40c6a1c5d9760407322ad745a05c06f3440eb36609c4bfd91f49341bf19aa183"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.889697 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" event={"ID":"0e1b7310-1ee8-4077-b873-a6a54f445381","Type":"ContainerStarted","Data":"b4ed1f0cdff69387f6cd133c3cbc0e35445ea4f181025c491c371d6f80ec3b04"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.897668 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-sqw5b" event={"ID":"380e06aa-5ba0-4ced-9d0a-6d7e4ec143ba","Type":"ContainerStarted","Data":"20f20affbe8d6ca0f6c959712610f3273da2f6a11fc66ca9342268e29cbdf333"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.903949 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" event={"ID":"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4","Type":"ContainerStarted","Data":"ab7ed7555127965d9b5a07ac2d5dee462b5c7de042d3655a790caeccb3b2b2f6"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.904007 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" event={"ID":"ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4","Type":"ContainerStarted","Data":"865b80e222a1bc731fe1787f3a7ec643f7b91c0f7e007b94d88aa095397a664f"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.904999 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.908809 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" 
event={"ID":"b9794854-0c90-4d9a-a57f-3daf45d35d8c","Type":"ContainerStarted","Data":"73d46e28a18d2ccaba8034d07828b83baa12817293707f807a48ae43fbf0235f"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.913669 4929 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-q6pdb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.913865 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" podUID="ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.917374 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" event={"ID":"cd451cea-3090-4483-9141-a2f45ece2ce2","Type":"ContainerStarted","Data":"966205a4a97e190ffbb60406541f43dd65c735145dfe6406ee9edf546fb0eb94"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.917611 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" event={"ID":"cd451cea-3090-4483-9141-a2f45ece2ce2","Type":"ContainerStarted","Data":"86fc6157f59a51130f68c61f947a82b2bec51e9ddd716262b8bd81152da1f18c"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.919330 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4hh6l" podStartSLOduration=130.919318857 podStartE2EDuration="2m10.919318857s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.918766738 +0000 UTC m=+155.469133102" watchObservedRunningTime="2025-10-02 11:12:34.919318857 +0000 UTC m=+155.469685221" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.922982 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" podStartSLOduration=129.922967992 podStartE2EDuration="2m9.922967992s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.896205057 +0000 UTC m=+155.446571421" watchObservedRunningTime="2025-10-02 11:12:34.922967992 +0000 UTC m=+155.473334356" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.944484 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" event={"ID":"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9","Type":"ContainerStarted","Data":"e3e9b346f1b96a9f042d0441557cb23b28bf4a7a90487a1007c4e77edef2fee5"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.944533 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" event={"ID":"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9","Type":"ContainerStarted","Data":"efbbf8e5ce575d529bdb2bb0307e24380b319fcba27bfc6287183ff3ad30ef48"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 
11:12:34.954279 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-sqw5b" podStartSLOduration=5.954264042 podStartE2EDuration="5.954264042s" podCreationTimestamp="2025-10-02 11:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:34.953109472 +0000 UTC m=+155.503475836" watchObservedRunningTime="2025-10-02 11:12:34.954264042 +0000 UTC m=+155.504630406" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.960793 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:34 crc kubenswrapper[4929]: E1002 11:12:34.961356 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.461342434 +0000 UTC m=+156.011708798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.983399 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" event={"ID":"b4809e56-3d51-4508-954c-df41db145ee7","Type":"ContainerStarted","Data":"a193f125cac42c24a94f9cfbd27d0a25d7498ab940b500b9e1e3da2d7ae83dcf"} Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.993339 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gpcbf" Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.994273 4929 patch_prober.go:28] interesting pod/downloads-7954f5f757-qxkdv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 02 11:12:34 crc kubenswrapper[4929]: I1002 11:12:34.994307 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qxkdv" podUID="854dc2ee-5769-484e-a9a4-69a592dcaac1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.038370 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-sdssd" podStartSLOduration=131.038351126 podStartE2EDuration="2m11.038351126s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:35.004780458 +0000 UTC 
m=+155.555146822" watchObservedRunningTime="2025-10-02 11:12:35.038351126 +0000 UTC m=+155.588717490" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.067907 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.070531 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.570510185 +0000 UTC m=+156.120876599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.084315 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" podStartSLOduration=130.084292846 podStartE2EDuration="2m10.084292846s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:35.050574454 +0000 UTC m=+155.600940818" watchObservedRunningTime="2025-10-02 11:12:35.084292846 +0000 UTC m=+155.634659210" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.140389 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g28s8" podStartSLOduration=131.140368423 podStartE2EDuration="2m11.140368423s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:35.139563635 +0000 UTC m=+155.689929999" watchObservedRunningTime="2025-10-02 11:12:35.140368423 +0000 UTC m=+155.690734807" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.140765 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" podStartSLOduration=131.140759046 podStartE2EDuration="2m11.140759046s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:35.111212906 +0000 UTC m=+155.661579270" watchObservedRunningTime="2025-10-02 11:12:35.140759046 +0000 UTC m=+155.691125410" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.143826 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.172445 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.173417 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.673396292 +0000 UTC m=+156.223762656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.192098 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.193075 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.269241 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.281319 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.294256 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.794232342 +0000 UTC m=+156.344598706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.382438 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.382823 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.88280764 +0000 UTC m=+156.433174004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.384699 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:12:35 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld Oct 02 11:12:35 crc kubenswrapper[4929]: [+]process-running ok Oct 02 11:12:35 crc kubenswrapper[4929]: healthz check failed Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.384747 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.445426 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.445725 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.483403 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.483733 4929 patch_prober.go:28] interesting pod/apiserver-76f77b778f-hzmkl container/openshift-apiserver namespace/openshift-apiserver: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]log ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]etcd ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/generic-apiserver-start-informers ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/max-in-flight-filter ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 02 11:12:35 crc kubenswrapper[4929]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 02 11:12:35 crc kubenswrapper[4929]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/project.openshift.io-projectcache ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/openshift.io-startinformers ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 02 11:12:35 crc kubenswrapper[4929]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 02 11:12:35 crc kubenswrapper[4929]: livez check failed Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.483764 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl" podUID="c24e4fe9-f50e-4989-9316-e17e05b64acc" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.484014 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.983996878 +0000 UTC m=+156.534363242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.584381 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.584676 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.084654409 +0000 UTC m=+156.635020773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.584733 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.585067 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.085060353 +0000 UTC m=+156.635426717 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.685219 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.685471 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.185455465 +0000 UTC m=+156.735821829 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.786649 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.787015 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.286995555 +0000 UTC m=+156.837361919 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.887446 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.887765 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.387743649 +0000 UTC m=+156.938110013 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.988531 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:35 crc kubenswrapper[4929]: E1002 11:12:35.988862 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.488842745 +0000 UTC m=+157.039209109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.989436 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4sh99" event={"ID":"651a0356-18c1-46e1-a4c4-a4c409cd3a1e","Type":"ContainerStarted","Data":"e1b3cc29e4f9dafaf680a7a256344ff83be6079528a991a09ca5fb41d9952eb8"} Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.989492 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4sh99" event={"ID":"651a0356-18c1-46e1-a4c4-a4c409cd3a1e","Type":"ContainerStarted","Data":"c5d2f472d6735d5408ac986eeab8b0645b88afdfce45a1d762db47705dfce08c"} Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.989510 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4sh99" Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.991599 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" event={"ID":"76d63ecb-462e-4002-8d76-b083b360a907","Type":"ContainerStarted","Data":"57c1f6ad4a67867bd5638cc510bf90f3f078fbfd17ecebd68cca60e2d69d9065"} Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.994014 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" event={"ID":"3e7e5719-e329-4073-9ef4-6c26073399f5","Type":"ContainerStarted","Data":"5d6a7248a8af8c6970905d84f30f1b33661dbe362d561b12b6cf6db118e8a158"} Oct 02 11:12:35 crc kubenswrapper[4929]: I1002 11:12:35.996653 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zjjls" event={"ID":"ce2732c9-d9ca-4f15-a43c-a92186f698d8","Type":"ContainerStarted","Data":"523618c21333464eb28a32c4b3620398a686247fb221cc0019753dd536fc71a1"} Oct 02 11:12:36 crc 
kubenswrapper[4929]: I1002 11:12:36.000125 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" event={"ID":"00ee1580-a680-421c-8e39-edd438584800","Type":"ContainerStarted","Data":"9173bae2c0a22a079928e5f1a8fad4b9f8e1f323945ac3ad8b1ede3abd284188"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.013632 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" event={"ID":"0e1b7310-1ee8-4077-b873-a6a54f445381","Type":"ContainerStarted","Data":"b9ff26747645714816eff51fc0062d2ca77df7fe305af4154db5401f332965b7"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.014140 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.026873 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" event={"ID":"e4a44609-f273-4180-b294-1362acf3d0af","Type":"ContainerStarted","Data":"76e6926ef3ff6181df769dc7aade2db7e9dae7c29f737f68eb728efb08341131"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.032068 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" event={"ID":"c43f5d6e-be64-4291-bd66-2548210bc566","Type":"ContainerStarted","Data":"15d30e0ef3499fbd8b0cde21a33a4655fffb309245eab89915c524a69128a50b"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.040841 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" event={"ID":"d6016b7e-1424-416e-bcd9-a3490eec1493","Type":"ContainerStarted","Data":"6ab2e9de6f703690af4a50429647ad5186aa2671aae390b91ab3f56a15661cf6"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.051312 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" event={"ID":"9118fbfd-e206-451a-b873-7041e07207a4","Type":"ContainerStarted","Data":"4ea51367a7fa55f19c2caae0e13cbf0272216d5e43c859db5f51c750f31ae561"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.051347 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" event={"ID":"9118fbfd-e206-451a-b873-7041e07207a4","Type":"ContainerStarted","Data":"2dfaa41f85eab6290e18cbafb424c34cb94945292b27fc712d28122351a1f686"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.055163 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" event={"ID":"029e08ec-dabe-4a49-8ec6-b1b22999b713","Type":"ContainerStarted","Data":"c3d3fb69ad857644a6042b5268a9e60001475062a1eb8e54cf2a2e6f5cc034ac"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.060479 4929 generic.go:334] "Generic (PLEG): container finished" podID="cc152222-074c-4883-bb5f-f0a836a96023" containerID="27335184540d8540f6619896479f10caf0123a6806776241d36676eddf8e3ef9" exitCode=0 Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.060667 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" 
event={"ID":"cc152222-074c-4883-bb5f-f0a836a96023","Type":"ContainerDied","Data":"27335184540d8540f6619896479f10caf0123a6806776241d36676eddf8e3ef9"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.063170 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lrrcl" event={"ID":"bb86ef9b-bea0-4fb7-b11b-804e84be19cf","Type":"ContainerStarted","Data":"8293c756cdc293da2dd020379da2ceffbd80ddfb3cf84e2c5fcf4f8cca1c5876"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.069246 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" event={"ID":"b9794854-0c90-4d9a-a57f-3daf45d35d8c","Type":"ContainerStarted","Data":"66e78dc49a2f4c015b6cf1825fbb78876d9f59179ed33f9bbcc4e13043b7e51f"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.073927 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" event={"ID":"ac426e70-652b-44e7-83ff-3a6c26942921","Type":"ContainerStarted","Data":"c3d7f2026114b6860f5cebc0e9709fe60f0cd6748f149ed5643ed932c2dba9b1"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.074023 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.077390 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rbgn4" podStartSLOduration=132.077371371 podStartE2EDuration="2m12.077371371s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.069927886 +0000 UTC m=+156.620294250" watchObservedRunningTime="2025-10-02 11:12:36.077371371 +0000 UTC m=+156.627737735" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.079419 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4sh99" podStartSLOduration=7.07940566 podStartE2EDuration="7.07940566s" podCreationTimestamp="2025-10-02 11:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.056044172 +0000 UTC m=+156.606410546" watchObservedRunningTime="2025-10-02 11:12:36.07940566 +0000 UTC m=+156.629772024" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.081868 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fvqtg" event={"ID":"2a8296b7-a5ae-4091-a32b-0fe704cbc3f9","Type":"ContainerStarted","Data":"681525e55a67b5c714daa7c88f7cfada1859c528fcf013819fa2d023e37dc643"} Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.090142 4929 patch_prober.go:28] interesting pod/downloads-7954f5f757-qxkdv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.090181 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qxkdv" podUID="854dc2ee-5769-484e-a9a4-69a592dcaac1" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.090234 4929 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7wfcz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.090290 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" podUID="923649eb-ddfc-4299-b94e-3f549a863233" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.091440 4929 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-h4pzk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.091471 4929 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-q6pdb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.091494 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" podUID="ce1dc2e1-e9b1-49a0-aa67-39e3ac2eded4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.091488 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" podUID="78ee48b5-a924-4253-8309-cdff7355ec6d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.092035 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.092330 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.592312421 +0000 UTC m=+157.142678775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.093548 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.095468 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.595454379 +0000 UTC m=+157.145820833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.100886 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-mpbk4" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.100898 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ln27l" podStartSLOduration=132.100879144 podStartE2EDuration="2m12.100879144s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.099994174 +0000 UTC m=+156.650360548" watchObservedRunningTime="2025-10-02 11:12:36.100879144 +0000 UTC m=+156.651245508" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.102232 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tbxkj" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.121844 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sp9ft" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.137137 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" podStartSLOduration=131.137119433 podStartE2EDuration="2m11.137119433s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.135761376 +0000 UTC m=+156.686127750" watchObservedRunningTime="2025-10-02 11:12:36.137119433 +0000 UTC 
m=+156.687485797" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.194871 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6q5rh" podStartSLOduration=132.194857296 podStartE2EDuration="2m12.194857296s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.158937389 +0000 UTC m=+156.709303753" watchObservedRunningTime="2025-10-02 11:12:36.194857296 +0000 UTC m=+156.745223660" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.195644 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4c4v9" podStartSLOduration=132.195640823 podStartE2EDuration="2m12.195640823s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.194317558 +0000 UTC m=+156.744683922" watchObservedRunningTime="2025-10-02 11:12:36.195640823 +0000 UTC m=+156.746007187" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.202607 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.203520 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.703504092 +0000 UTC m=+157.253870456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.245191 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pxt8j" podStartSLOduration=131.245170726 podStartE2EDuration="2m11.245170726s" podCreationTimestamp="2025-10-02 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.240518727 +0000 UTC m=+156.790885101" watchObservedRunningTime="2025-10-02 11:12:36.245170726 +0000 UTC m=+156.795537090" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.269941 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nqrd9" podStartSLOduration=132.269924932 podStartE2EDuration="2m12.269924932s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.267178098 +0000 UTC m=+156.817544462" watchObservedRunningTime="2025-10-02 11:12:36.269924932 +0000 UTC m=+156.820291296" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.304084 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.304360 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.804349779 +0000 UTC m=+157.354716143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.326350 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7r78f" podStartSLOduration=132.32633426 podStartE2EDuration="2m12.32633426s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.303263322 +0000 UTC m=+156.853629696" watchObservedRunningTime="2025-10-02 11:12:36.32633426 +0000 UTC m=+156.876700624" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.326742 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rch4g" podStartSLOduration=132.326738314 podStartE2EDuration="2m12.326738314s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.324787348 +0000 UTC m=+156.875153712" watchObservedRunningTime="2025-10-02 11:12:36.326738314 +0000 UTC m=+156.877104668" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.374893 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:12:36 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld Oct 02 11:12:36 crc kubenswrapper[4929]: [+]process-running ok Oct 02 11:12:36 crc kubenswrapper[4929]: healthz check failed Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.374973 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.376396 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm" podStartSLOduration=132.376379601 podStartE2EDuration="2m12.376379601s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:36.375412958 +0000 UTC m=+156.925779322" watchObservedRunningTime="2025-10-02 11:12:36.376379601 +0000 UTC m=+156.926745965" Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.405376 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.405672 
4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:36.905656862 +0000 UTC m=+157.456023226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.506410 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.506748 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.006732776 +0000 UTC m=+157.557099140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.607289 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.607487 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.107458829 +0000 UTC m=+157.657825203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.708967 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.709269 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.209258158 +0000 UTC m=+157.759624512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.809817 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.809918 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.309901218 +0000 UTC m=+157.860267582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.810099 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.810359 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.310350344 +0000 UTC m=+157.860716718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.911082 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.911236 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.411208081 +0000 UTC m=+157.961574435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:36 crc kubenswrapper[4929]: I1002 11:12:36.911315 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:36 crc kubenswrapper[4929]: E1002 11:12:36.911595 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.411583364 +0000 UTC m=+157.961949728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.012450 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.012658 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.512630578 +0000 UTC m=+158.062996942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.012855 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.013192 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.513184057 +0000 UTC m=+158.063550421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.094313 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" event={"ID":"b9794854-0c90-4d9a-a57f-3daf45d35d8c","Type":"ContainerStarted","Data":"4758e2ed877df9cf9634d5f4bb9ab3fc4196f0637f26c67019a193f6101aabaf"} Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.101968 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.114195 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.114289 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.614268142 +0000 UTC m=+158.164634506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.115754 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.118208 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.618194036 +0000 UTC m=+158.168560390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.119912 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-q6pdb" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.218190 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.218538 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.718521615 +0000 UTC m=+158.268887979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.319825 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.320245 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.820220912 +0000 UTC m=+158.370587276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.367562 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 11:12:37 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld Oct 02 11:12:37 crc kubenswrapper[4929]: [+]process-running ok Oct 02 11:12:37 crc kubenswrapper[4929]: healthz check failed Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.367645 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.383036 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fddvw"] Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.384023 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.389746 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.401556 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fddvw"] Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.421979 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.422140 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-catalog-content\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.422187 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-utilities\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.422243 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5nw\" (UniqueName: \"kubernetes.io/projected/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-kube-api-access-9p5nw\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.422349 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.922330242 +0000 UTC m=+158.472696616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.453108 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.523257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.523312 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-catalog-content\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.523366 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-utilities\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.523423 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5nw\" (UniqueName: \"kubernetes.io/projected/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-kube-api-access-9p5nw\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.524115 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.0241004 +0000 UTC m=+158.574466764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.524584 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-catalog-content\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.524792 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-utilities\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.540578 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5nw\" (UniqueName: \"kubernetes.io/projected/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-kube-api-access-9p5nw\") pod \"community-operators-fddvw\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.575542 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7nz9s"] Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.575737 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc152222-074c-4883-bb5f-f0a836a96023" containerName="collect-profiles" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.575750 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc152222-074c-4883-bb5f-f0a836a96023" containerName="collect-profiles" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.575835 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc152222-074c-4883-bb5f-f0a836a96023" containerName="collect-profiles" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.576424 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.581862 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.595365 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7nz9s"] Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.626247 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc152222-074c-4883-bb5f-f0a836a96023-secret-volume\") pod \"cc152222-074c-4883-bb5f-f0a836a96023\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.626311 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrxdj\" (UniqueName: \"kubernetes.io/projected/cc152222-074c-4883-bb5f-f0a836a96023-kube-api-access-hrxdj\") pod \"cc152222-074c-4883-bb5f-f0a836a96023\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.626651 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.126626485 +0000 UTC m=+158.676992839 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.626747 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.627005 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc152222-074c-4883-bb5f-f0a836a96023-config-volume\") pod \"cc152222-074c-4883-bb5f-f0a836a96023\" (UID: \"cc152222-074c-4883-bb5f-f0a836a96023\") " Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.637692 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc152222-074c-4883-bb5f-f0a836a96023-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc152222-074c-4883-bb5f-f0a836a96023" (UID: "cc152222-074c-4883-bb5f-f0a836a96023"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.638343 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.138330495 +0000 UTC m=+158.688696859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.627676 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.641057 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-utilities\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.641142 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tgc\" (UniqueName: \"kubernetes.io/projected/46a28329-6450-46ac-b889-ec17f4aca6f2-kube-api-access-67tgc\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.641249 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-catalog-content\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.641295 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc152222-074c-4883-bb5f-f0a836a96023-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.642358 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc152222-074c-4883-bb5f-f0a836a96023-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc152222-074c-4883-bb5f-f0a836a96023" (UID: "cc152222-074c-4883-bb5f-f0a836a96023"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.644205 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc152222-074c-4883-bb5f-f0a836a96023-kube-api-access-hrxdj" (OuterVolumeSpecName: "kube-api-access-hrxdj") pod "cc152222-074c-4883-bb5f-f0a836a96023" (UID: "cc152222-074c-4883-bb5f-f0a836a96023"). InnerVolumeSpecName "kube-api-access-hrxdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.741814 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.742242 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.242207306 +0000 UTC m=+158.792573680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.742302 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-catalog-content\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.742418 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.742511 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-utilities\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.742598 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67tgc\" (UniqueName: \"kubernetes.io/projected/46a28329-6450-46ac-b889-ec17f4aca6f2-kube-api-access-67tgc\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.742677 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc152222-074c-4883-bb5f-f0a836a96023-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.742698 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrxdj\" (UniqueName: \"kubernetes.io/projected/cc152222-074c-4883-bb5f-f0a836a96023-kube-api-access-hrxdj\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.743064 4929 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.243049894 +0000 UTC m=+158.793416258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.743463 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-catalog-content\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.743548 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-utilities\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.749599 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.763874 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67tgc\" (UniqueName: \"kubernetes.io/projected/46a28329-6450-46ac-b889-ec17f4aca6f2-kube-api-access-67tgc\") pod \"certified-operators-7nz9s\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.774551 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8lgqp"] Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.775462 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.803274 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lgqp"] Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.843396 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.843616 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc88b\" (UniqueName: \"kubernetes.io/projected/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-kube-api-access-sc88b\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.843655 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-utilities\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.843674 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-catalog-content\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.843770 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.343753627 +0000 UTC m=+158.894119991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.894198 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.944870 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-utilities\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.945159 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-catalog-content\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.945237 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc88b\" (UniqueName: \"kubernetes.io/projected/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-kube-api-access-sc88b\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.945256 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:12:37 crc kubenswrapper[4929]: E1002 11:12:37.945516 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.445504114 +0000 UTC m=+158.995870478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.945951 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-utilities\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp"
Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.946224 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-catalog-content\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp"
Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.968759 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc88b\" (UniqueName: \"kubernetes.io/projected/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-kube-api-access-sc88b\") pod \"community-operators-8lgqp\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " pod="openshift-marketplace/community-operators-8lgqp"
Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.971902 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tx64j"]
Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.976029 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:37 crc kubenswrapper[4929]: I1002 11:12:37.991244 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tx64j"]
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.008229 4929 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.051551 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.051747 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-utilities\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.051820 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-catalog-content\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.051843 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrbv\" (UniqueName: \"kubernetes.io/projected/19ba1996-79cb-44bb-a285-5603d1fc649e-kube-api-access-cxrbv\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: E1002 11:12:38.052136 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.552116599 +0000 UTC m=+159.102482973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.082288 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fddvw"]
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.103895 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lgqp"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.147260 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb" event={"ID":"cc152222-074c-4883-bb5f-f0a836a96023","Type":"ContainerDied","Data":"1c08c15156c4badbb0b11d3ced9473f2f0a6d8ba21edd3adfbaff1fa5c064ebb"}
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.147300 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c08c15156c4badbb0b11d3ced9473f2f0a6d8ba21edd3adfbaff1fa5c064ebb"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.147385 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.153592 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.153661 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-utilities\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.153709 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-catalog-content\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.153727 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxrbv\" (UniqueName: \"kubernetes.io/projected/19ba1996-79cb-44bb-a285-5603d1fc649e-kube-api-access-cxrbv\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: E1002 11:12:38.154229 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.654217568 +0000 UTC m=+159.204583932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.154686 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-utilities\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.154905 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-catalog-content\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.192575 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" event={"ID":"b9794854-0c90-4d9a-a57f-3daf45d35d8c","Type":"ContainerStarted","Data":"bc804af1a7e64691a444b981f4ce2bf261b14f2456367b6c978568fad705f2b4"}
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.192611 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fddvw" event={"ID":"4e5b0e1b-b379-42e5-a7aa-56a1736771ed","Type":"ContainerStarted","Data":"9e9e16e41d8ed9f38a4e76512549395e6b35997fb632174d36050854ef6c5d0d"}
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.199904 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrbv\" (UniqueName: \"kubernetes.io/projected/19ba1996-79cb-44bb-a285-5603d1fc649e-kube-api-access-cxrbv\") pod \"certified-operators-tx64j\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.256173 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:12:38 crc kubenswrapper[4929]: E1002 11:12:38.258375 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.758354488 +0000 UTC m=+159.308720852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.353771 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tx64j"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.358840 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:12:38 crc kubenswrapper[4929]: E1002 11:12:38.359146 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 11:12:38.859134733 +0000 UTC m=+159.409501097 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r7lmd" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.386774 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 11:12:38 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld
Oct 02 11:12:38 crc kubenswrapper[4929]: [+]process-running ok
Oct 02 11:12:38 crc kubenswrapper[4929]: healthz check failed
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.386827 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.388354 4929 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T11:12:38.008253369Z","Handler":null,"Name":""}
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.399773 4929 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.399804 4929 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.460241 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.548658 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.562489 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.585997 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lgqp"]
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.595194 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.595231 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:12:38 crc kubenswrapper[4929]: W1002 11:12:38.606235 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2fc666_f2c4_4d6c_b250_58213fc0dd0c.slice/crio-08e4807299c2366575315b560b3c9ff8fab2f9cb649f14210e340220da2e86a7 WatchSource:0}: Error finding container 08e4807299c2366575315b560b3c9ff8fab2f9cb649f14210e340220da2e86a7: Status 404 returned error can't find the container with id 08e4807299c2366575315b560b3c9ff8fab2f9cb649f14210e340220da2e86a7
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.648671 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7nz9s"]
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.742598 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r7lmd\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.772797 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tx64j"]
Oct 02 11:12:38 crc kubenswrapper[4929]: W1002 11:12:38.792918 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19ba1996_79cb_44bb_a285_5603d1fc649e.slice/crio-251c782ecc42bf47904c5627f6d0aeb14d30eb1b8c27a8bb56a381c8936e889a WatchSource:0}: Error finding container 251c782ecc42bf47904c5627f6d0aeb14d30eb1b8c27a8bb56a381c8936e889a: Status 404 returned error can't find the container with id 251c782ecc42bf47904c5627f6d0aeb14d30eb1b8c27a8bb56a381c8936e889a
Oct 02 11:12:38 crc kubenswrapper[4929]: I1002 11:12:38.874252 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.138260 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r7lmd"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.168046 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" event={"ID":"94158d82-3849-4716-a7a8-61b0c6236d1e","Type":"ContainerStarted","Data":"7c2b236d0fcbbce94c99cd76a4a34d278b10cf04fc4532f90abcab48645b6ca1"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.179583 4929 generic.go:334] "Generic (PLEG): container finished" podID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerID="c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba" exitCode=0
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.179677 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nz9s" event={"ID":"46a28329-6450-46ac-b889-ec17f4aca6f2","Type":"ContainerDied","Data":"c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.179716 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nz9s" event={"ID":"46a28329-6450-46ac-b889-ec17f4aca6f2","Type":"ContainerStarted","Data":"ae7715b38fc729f6cb3a1587dd235d612b5472636eb0411a4028d4c843159bfd"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.181197 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.189252 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" event={"ID":"b9794854-0c90-4d9a-a57f-3daf45d35d8c","Type":"ContainerStarted","Data":"f82b087afcd513488b2786f961e9c2ed21b25777cf71a68c566d8210e220cb9c"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.198152 4929 generic.go:334] "Generic (PLEG): container finished" podID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerID="86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af" exitCode=0
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.199254 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fddvw" event={"ID":"4e5b0e1b-b379-42e5-a7aa-56a1736771ed","Type":"ContainerDied","Data":"86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.205828 4929 generic.go:334] "Generic (PLEG): container finished" podID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerID="da4620ad550e58f9236112b57ac2f3ebeeb8d5adfb38a9c88557ddf167508c23" exitCode=0
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.206095 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tx64j" event={"ID":"19ba1996-79cb-44bb-a285-5603d1fc649e","Type":"ContainerDied","Data":"da4620ad550e58f9236112b57ac2f3ebeeb8d5adfb38a9c88557ddf167508c23"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.206125 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tx64j" event={"ID":"19ba1996-79cb-44bb-a285-5603d1fc649e","Type":"ContainerStarted","Data":"251c782ecc42bf47904c5627f6d0aeb14d30eb1b8c27a8bb56a381c8936e889a"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.213006 4929 generic.go:334] "Generic (PLEG): container finished" podID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerID="49ef16407cded8cf166396e5a4a5b0f12035ae410e91ca97662915c0320fa51b" exitCode=0
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.213060 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lgqp" event={"ID":"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c","Type":"ContainerDied","Data":"49ef16407cded8cf166396e5a4a5b0f12035ae410e91ca97662915c0320fa51b"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.213090 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lgqp" event={"ID":"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c","Type":"ContainerStarted","Data":"08e4807299c2366575315b560b3c9ff8fab2f9cb649f14210e340220da2e86a7"}
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.233255 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kw7jx" podStartSLOduration=10.233236591 podStartE2EDuration="10.233236591s" podCreationTimestamp="2025-10-02 11:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:39.227752043 +0000 UTC m=+159.778118407" watchObservedRunningTime="2025-10-02 11:12:39.233236591 +0000 UTC m=+159.783602945"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.367899 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-74dq9"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.368149 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 11:12:39 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld
Oct 02 11:12:39 crc kubenswrapper[4929]: [+]process-running ok
Oct 02 11:12:39 crc kubenswrapper[4929]: healthz check failed
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.368208 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.368866 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.371287 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.384496 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-74dq9"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.481102 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-utilities\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.481177 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5z6g\" (UniqueName: \"kubernetes.io/projected/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-kube-api-access-h5z6g\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.481202 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-catalog-content\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.582828 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-utilities\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.583220 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5z6g\" (UniqueName: \"kubernetes.io/projected/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-kube-api-access-h5z6g\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.583351 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-catalog-content\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.583391 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-utilities\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.583895 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-catalog-content\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.602707 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5z6g\" (UniqueName: \"kubernetes.io/projected/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-kube-api-access-h5z6g\") pod \"redhat-marketplace-74dq9\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.691496 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74dq9"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.765319 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rkvt7"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.766911 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.772761 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkvt7"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.786556 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-utilities\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.786638 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-catalog-content\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.786676 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlcfz\" (UniqueName: \"kubernetes.io/projected/ee96c3e5-aca5-4493-922c-72a05f3b6c93-kube-api-access-dlcfz\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.870357 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.872101 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.874317 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.879449 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.880839 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.887368 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-catalog-content\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.887413 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlcfz\" (UniqueName: \"kubernetes.io/projected/ee96c3e5-aca5-4493-922c-72a05f3b6c93-kube-api-access-dlcfz\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.887471 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/909f4df9-4b16-40ea-9671-1713cf023319-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.887496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/909f4df9-4b16-40ea-9671-1713cf023319-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.887607 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-utilities\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.888040 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-catalog-content\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.888085 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-utilities\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.907151 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlcfz\" (UniqueName: \"kubernetes.io/projected/ee96c3e5-aca5-4493-922c-72a05f3b6c93-kube-api-access-dlcfz\") pod \"redhat-marketplace-rkvt7\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.907725 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-74dq9"]
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.988552 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/909f4df9-4b16-40ea-9671-1713cf023319-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.988594 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/909f4df9-4b16-40ea-9671-1713cf023319-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:39 crc kubenswrapper[4929]: I1002 11:12:39.988901 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/909f4df9-4b16-40ea-9671-1713cf023319-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.006614 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/909f4df9-4b16-40ea-9671-1713cf023319-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.091455 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkvt7"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.167217 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.193883 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.224390 4929 generic.go:334] "Generic (PLEG): container finished" podID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerID="3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7" exitCode=0
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.224454 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74dq9" event={"ID":"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc","Type":"ContainerDied","Data":"3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7"}
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.224484 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74dq9" event={"ID":"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc","Type":"ContainerStarted","Data":"289bc6ade4379244edb614dd188a4aa00b2ceb5c1b9e942112c42019b4d4c3fc"}
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.234007 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" event={"ID":"94158d82-3849-4716-a7a8-61b0c6236d1e","Type":"ContainerStarted","Data":"9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6"}
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.234168 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.339951 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" podStartSLOduration=136.339928607 podStartE2EDuration="2m16.339928607s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:40.26509732 +0000 UTC m=+160.815463824" watchObservedRunningTime="2025-10-02 11:12:40.339928607 +0000 UTC m=+160.890294971"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.340625 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkvt7"]
Oct 02 11:12:40 crc kubenswrapper[4929]: W1002 11:12:40.356764 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee96c3e5_aca5_4493_922c_72a05f3b6c93.slice/crio-0910d14039067f108eba0e411042d98b5dadbae33b364ea072c6d4f2886a552d WatchSource:0}: Error finding container 0910d14039067f108eba0e411042d98b5dadbae33b364ea072c6d4f2886a552d: Status 404 returned error can't find the container with id 0910d14039067f108eba0e411042d98b5dadbae33b364ea072c6d4f2886a552d
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.368748 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 11:12:40 crc kubenswrapper[4929]: [-]has-synced failed: reason withheld
Oct 02 11:12:40 crc kubenswrapper[4929]: [+]process-running ok
Oct 02 11:12:40 crc kubenswrapper[4929]: healthz check failed
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.368797 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.404674 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Oct 02 11:12:40 crc kubenswrapper[4929]: W1002 11:12:40.429885 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod909f4df9_4b16_40ea_9671_1713cf023319.slice/crio-7359b3024c990a14365a098b4e9f3edb94652c8518530b9e0aac87920f36254c WatchSource:0}: Error finding container 7359b3024c990a14365a098b4e9f3edb94652c8518530b9e0aac87920f36254c: Status 404 returned error can't find the container with id 7359b3024c990a14365a098b4e9f3edb94652c8518530b9e0aac87920f36254c
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.449542 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.454449 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-hzmkl"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.562285 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4gjxg"]
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.566799 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.569233 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.573348 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gjxg"]
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.598893 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-utilities\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.598980 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-catalog-content\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.599036 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxjn5\" (UniqueName: \"kubernetes.io/projected/f6a1f6f5-57b7-40f5-a463-356426589a84-kube-api-access-zxjn5\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.600023 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-j8pwm"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.714630 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxjn5\" (UniqueName: \"kubernetes.io/projected/f6a1f6f5-57b7-40f5-a463-356426589a84-kube-api-access-zxjn5\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.715038 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-utilities\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.715106 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-catalog-content\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.715599 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-catalog-content\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.715933 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-utilities\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.750785 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxjn5\" (UniqueName: \"kubernetes.io/projected/f6a1f6f5-57b7-40f5-a463-356426589a84-kube-api-access-zxjn5\") pod \"redhat-operators-4gjxg\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.938347 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gjxg"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.965836 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smh4r"]
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.967732 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:40 crc kubenswrapper[4929]: I1002 11:12:40.972363 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smh4r"]
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.018816 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rq48\" (UniqueName: \"kubernetes.io/projected/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-kube-api-access-9rq48\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.018905 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-catalog-content\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.019021 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-utilities\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.120271 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-utilities\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.121433 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rq48\" (UniqueName: \"kubernetes.io/projected/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-kube-api-access-9rq48\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.121503 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-catalog-content\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.122648 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-catalog-content\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.122719 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-utilities\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.147396 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rq48\" (UniqueName: \"kubernetes.io/projected/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-kube-api-access-9rq48\") pod \"redhat-operators-smh4r\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.209226 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gjxg"]
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.255080 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"909f4df9-4b16-40ea-9671-1713cf023319","Type":"ContainerStarted","Data":"4be53866f39712a5cbe40073d3ccae6bf2612c3b491407192a0d156f8c57d2d8"}
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.255142 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"909f4df9-4b16-40ea-9671-1713cf023319","Type":"ContainerStarted","Data":"7359b3024c990a14365a098b4e9f3edb94652c8518530b9e0aac87920f36254c"}
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.255307 4929 patch_prober.go:28] interesting pod/downloads-7954f5f757-qxkdv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.255350 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qxkdv" podUID="854dc2ee-5769-484e-a9a4-69a592dcaac1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.255758 4929 patch_prober.go:28] interesting pod/downloads-7954f5f757-qxkdv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.255790 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qxkdv" podUID="854dc2ee-5769-484e-a9a4-69a592dcaac1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.260915 4929 generic.go:334] "Generic (PLEG): container finished" podID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerID="ba59d9488f469693648b0e6b29e09e0810a950106f5881b50af741310a8f183d" exitCode=0
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.261613 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkvt7" event={"ID":"ee96c3e5-aca5-4493-922c-72a05f3b6c93","Type":"ContainerDied","Data":"ba59d9488f469693648b0e6b29e09e0810a950106f5881b50af741310a8f183d"}
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.261649 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkvt7" event={"ID":"ee96c3e5-aca5-4493-922c-72a05f3b6c93","Type":"ContainerStarted","Data":"0910d14039067f108eba0e411042d98b5dadbae33b364ea072c6d4f2886a552d"}
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.286607 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.286589125 podStartE2EDuration="2.286589125s" podCreationTimestamp="2025-10-02 11:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:41.268186336 +0000 UTC m=+161.818552800" watchObservedRunningTime="2025-10-02 11:12:41.286589125 +0000 UTC m=+161.836955489"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.303803 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smh4r"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.364480 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qxjwx"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.368184 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qxjwx"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.592038 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zc6nf"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.592370 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zc6nf"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.595200 4929 patch_prober.go:28] interesting pod/console-f9d7485db-zc6nf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body=
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.595263 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zc6nf" podUID="c446dd7b-73fd-4b60-91d9-f1b74df3b69a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.662701 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smh4r"]
Oct 02 11:12:41 crc kubenswrapper[4929]: W1002 11:12:41.684377 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1257b93_1170_45a1_b46b_a6bb3c4a2bad.slice/crio-f9f9f31ad5d348d5dbd710e89107827f0616e15a488bf73995ea733bdf1e23b4 WatchSource:0}: Error finding container f9f9f31ad5d348d5dbd710e89107827f0616e15a488bf73995ea733bdf1e23b4: Status 404 returned error can't find the container with id f9f9f31ad5d348d5dbd710e89107827f0616e15a488bf73995ea733bdf1e23b4
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.718317 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.719586 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.722237 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.723930 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.729397 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9578a33-3b82-4b8b-8c9a-6701c0422602-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.729493 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9578a33-3b82-4b8b-8c9a-6701c0422602-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.741407 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.830842 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9578a33-3b82-4b8b-8c9a-6701c0422602-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.831473 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9578a33-3b82-4b8b-8c9a-6701c0422602-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.831015 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9578a33-3b82-4b8b-8c9a-6701c0422602-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:41 crc kubenswrapper[4929]: I1002 11:12:41.858331 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9578a33-3b82-4b8b-8c9a-6701c0422602-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.048422 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.284411 4929 generic.go:334] "Generic (PLEG): container finished" podID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerID="f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4" exitCode=0
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.284473 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gjxg" event={"ID":"f6a1f6f5-57b7-40f5-a463-356426589a84","Type":"ContainerDied","Data":"f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4"}
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.284528 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gjxg" event={"ID":"f6a1f6f5-57b7-40f5-a463-356426589a84","Type":"ContainerStarted","Data":"0d6e00751c439799d93ec4ca120bb307ea5a833a15eeb8406c6aa0267dd681e7"}
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.294709 4929 generic.go:334] "Generic (PLEG): container finished" podID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerID="5b46c16239990a924b60c6ca6a9fa834da0ba72fd2cb50eb2926e3e4628b32f6" exitCode=0
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.294837 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smh4r" event={"ID":"b1257b93-1170-45a1-b46b-a6bb3c4a2bad","Type":"ContainerDied","Data":"5b46c16239990a924b60c6ca6a9fa834da0ba72fd2cb50eb2926e3e4628b32f6"}
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.294868 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smh4r" event={"ID":"b1257b93-1170-45a1-b46b-a6bb3c4a2bad","Type":"ContainerStarted","Data":"f9f9f31ad5d348d5dbd710e89107827f0616e15a488bf73995ea733bdf1e23b4"}
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.318739 4929 generic.go:334] "Generic (PLEG): container finished" podID="909f4df9-4b16-40ea-9671-1713cf023319" containerID="4be53866f39712a5cbe40073d3ccae6bf2612c3b491407192a0d156f8c57d2d8" exitCode=0
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.318887 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"909f4df9-4b16-40ea-9671-1713cf023319","Type":"ContainerDied","Data":"4be53866f39712a5cbe40073d3ccae6bf2612c3b491407192a0d156f8c57d2d8"}
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.320745 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.323591 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qxjwx"
Oct 02 11:12:42 crc kubenswrapper[4929]: W1002 11:12:42.336108 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc9578a33_3b82_4b8b_8c9a_6701c0422602.slice/crio-d1f1c2b184412857d794b6006246b75ad5ae87090ab8d716b44984d32eb21c92 WatchSource:0}: Error finding container d1f1c2b184412857d794b6006246b75ad5ae87090ab8d716b44984d32eb21c92: Status 404 returned error can't find the container with id d1f1c2b184412857d794b6006246b75ad5ae87090ab8d716b44984d32eb21c92
Oct 02 11:12:42 crc kubenswrapper[4929]: I1002 11:12:42.351296 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk"
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.331878 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c9578a33-3b82-4b8b-8c9a-6701c0422602","Type":"ContainerStarted","Data":"15b7803a8ea6672455a8618aa388e4fb8c82d5576b105c607522b933fd3218be"}
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.332203 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c9578a33-3b82-4b8b-8c9a-6701c0422602","Type":"ContainerStarted","Data":"d1f1c2b184412857d794b6006246b75ad5ae87090ab8d716b44984d32eb21c92"}
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.684621 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.700281 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.700263816 podStartE2EDuration="2.700263816s" podCreationTimestamp="2025-10-02 11:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:43.350948938 +0000 UTC m=+163.901315312" watchObservedRunningTime="2025-10-02 11:12:43.700263816 +0000 UTC m=+164.250630180"
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.764758 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/909f4df9-4b16-40ea-9671-1713cf023319-kube-api-access\") pod \"909f4df9-4b16-40ea-9671-1713cf023319\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") "
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.764807 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/909f4df9-4b16-40ea-9671-1713cf023319-kubelet-dir\") pod \"909f4df9-4b16-40ea-9671-1713cf023319\" (UID: \"909f4df9-4b16-40ea-9671-1713cf023319\") "
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.765060 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/909f4df9-4b16-40ea-9671-1713cf023319-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "909f4df9-4b16-40ea-9671-1713cf023319" (UID: "909f4df9-4b16-40ea-9671-1713cf023319"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.774215 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909f4df9-4b16-40ea-9671-1713cf023319-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "909f4df9-4b16-40ea-9671-1713cf023319" (UID: "909f4df9-4b16-40ea-9671-1713cf023319"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.866331 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/909f4df9-4b16-40ea-9671-1713cf023319-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:43 crc kubenswrapper[4929]: I1002 11:12:43.866368 4929 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/909f4df9-4b16-40ea-9671-1713cf023319-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.347356 4929 generic.go:334] "Generic (PLEG): container finished" podID="c9578a33-3b82-4b8b-8c9a-6701c0422602" containerID="15b7803a8ea6672455a8618aa388e4fb8c82d5576b105c607522b933fd3218be" exitCode=0
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.347408 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c9578a33-3b82-4b8b-8c9a-6701c0422602","Type":"ContainerDied","Data":"15b7803a8ea6672455a8618aa388e4fb8c82d5576b105c607522b933fd3218be"}
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.355753 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"909f4df9-4b16-40ea-9671-1713cf023319","Type":"ContainerDied","Data":"7359b3024c990a14365a098b4e9f3edb94652c8518530b9e0aac87920f36254c"}
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.355788 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.355796 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7359b3024c990a14365a098b4e9f3edb94652c8518530b9e0aac87920f36254c"
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.456291 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4sh99"
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.736917 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:12:44 crc kubenswrapper[4929]: I1002 11:12:44.736997 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:12:47 crc kubenswrapper[4929]: I1002 11:12:47.120632 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID: \"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt"
Oct 02 11:12:47 crc kubenswrapper[4929]: I1002 11:12:47.145579 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49-metrics-certs\") pod \"network-metrics-daemon-59lbt\" (UID:
\"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49\") " pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:47 crc kubenswrapper[4929]: I1002 11:12:47.292497 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-59lbt" Oct 02 11:12:51 crc kubenswrapper[4929]: I1002 11:12:51.264857 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qxkdv" Oct 02 11:12:51 crc kubenswrapper[4929]: I1002 11:12:51.607698 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:51 crc kubenswrapper[4929]: I1002 11:12:51.613286 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:12:57 crc kubenswrapper[4929]: I1002 11:12:57.253107 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 11:12:58 crc kubenswrapper[4929]: I1002 11:12:58.882085 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:13:02 crc kubenswrapper[4929]: I1002 11:13:02.404294 4929 patch_prober.go:28] interesting pod/router-default-5444994796-qxjwx container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 11:13:02 crc kubenswrapper[4929]: I1002 11:13:02.404801 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-qxjwx" podUID="a34747b6-0f96-47d6-a12b-7fb4c40b5bab" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.533043 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.537437 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c9578a33-3b82-4b8b-8c9a-6701c0422602","Type":"ContainerDied","Data":"d1f1c2b184412857d794b6006246b75ad5ae87090ab8d716b44984d32eb21c92"} Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.537486 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1f1c2b184412857d794b6006246b75ad5ae87090ab8d716b44984d32eb21c92" Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.642813 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9578a33-3b82-4b8b-8c9a-6701c0422602-kube-api-access\") pod \"c9578a33-3b82-4b8b-8c9a-6701c0422602\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.643046 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9578a33-3b82-4b8b-8c9a-6701c0422602-kubelet-dir\") pod \"c9578a33-3b82-4b8b-8c9a-6701c0422602\" (UID: \"c9578a33-3b82-4b8b-8c9a-6701c0422602\") " Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.643495 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9578a33-3b82-4b8b-8c9a-6701c0422602-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c9578a33-3b82-4b8b-8c9a-6701c0422602" (UID: "c9578a33-3b82-4b8b-8c9a-6701c0422602"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.651775 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9578a33-3b82-4b8b-8c9a-6701c0422602-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c9578a33-3b82-4b8b-8c9a-6701c0422602" (UID: "c9578a33-3b82-4b8b-8c9a-6701c0422602"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.745335 4929 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9578a33-3b82-4b8b-8c9a-6701c0422602-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:09 crc kubenswrapper[4929]: I1002 11:13:09.745408 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9578a33-3b82-4b8b-8c9a-6701c0422602-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:10 crc kubenswrapper[4929]: I1002 11:13:10.544395 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 11:13:12 crc kubenswrapper[4929]: I1002 11:13:12.358417 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hnftv" Oct 02 11:13:14 crc kubenswrapper[4929]: I1002 11:13:14.736955 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:13:14 crc kubenswrapper[4929]: I1002 11:13:14.737322 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:13:31 crc kubenswrapper[4929]: E1002 11:13:31.551209 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 11:13:31 crc kubenswrapper[4929]: E1002 11:13:31.551891 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5z6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-74dq9_openshift-marketplace(e8e362b3-7e54-4408-ad25-b2c32c0aa3bc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:31 crc kubenswrapper[4929]: E1002 11:13:31.552972 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled\"" pod="openshift-marketplace/redhat-marketplace-74dq9" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" Oct 02 11:13:37 crc kubenswrapper[4929]: E1002 11:13:37.676038 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-74dq9" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" Oct 02 11:13:38 crc kubenswrapper[4929]: E1002 11:13:38.186107 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 11:13:38 crc kubenswrapper[4929]: E1002 11:13:38.186541 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxrbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tx64j_openshift-marketplace(19ba1996-79cb-44bb-a285-5603d1fc649e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:38 crc kubenswrapper[4929]: E1002 11:13:38.187736 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tx64j" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" Oct 02 11:13:44 crc kubenswrapper[4929]: E1002 11:13:44.628745 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tx64j" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" Oct 02 11:13:44 crc 
kubenswrapper[4929]: I1002 11:13:44.736729 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:13:44 crc kubenswrapper[4929]: I1002 11:13:44.736816 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:13:44 crc kubenswrapper[4929]: I1002 11:13:44.736878 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:13:44 crc kubenswrapper[4929]: I1002 11:13:44.738425 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:13:44 crc kubenswrapper[4929]: I1002 11:13:44.738668 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512" gracePeriod=600 Oct 02 11:13:46 crc kubenswrapper[4929]: I1002 11:13:46.748312 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512" exitCode=0 Oct 02 11:13:46 crc kubenswrapper[4929]: I1002 11:13:46.748373 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512"} Oct 02 11:13:46 crc kubenswrapper[4929]: E1002 11:13:46.994109 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 11:13:46 crc kubenswrapper[4929]: E1002 11:13:46.994708 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dlcfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rkvt7_openshift-marketplace(ee96c3e5-aca5-4493-922c-72a05f3b6c93): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:46 crc kubenswrapper[4929]: E1002 11:13:46.995919 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rkvt7" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.171639 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rkvt7" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.245154 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.245298 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9p5nw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fddvw_openshift-marketplace(4e5b0e1b-b379-42e5-a7aa-56a1736771ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.249085 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fddvw" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.258580 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.258761 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rq48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-smh4r_openshift-marketplace(b1257b93-1170-45a1-b46b-a6bb3c4a2bad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.260110 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-smh4r" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.338590 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.338729 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc88b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8lgqp_openshift-marketplace(8e2fc666-f2c4-4d6c-b250-58213fc0dd0c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.340078 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8lgqp" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.348603 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.348753 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxjn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4gjxg_openshift-marketplace(f6a1f6f5-57b7-40f5-a463-356426589a84): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.350582 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4gjxg" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.385638 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.385994 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67tgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7nz9s_openshift-marketplace(46a28329-6450-46ac-b889-ec17f4aca6f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.387818 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7nz9s" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" Oct 02 11:13:48 crc kubenswrapper[4929]: I1002 11:13:48.663739 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-59lbt"] Oct 02 11:13:48 crc kubenswrapper[4929]: I1002 11:13:48.765510 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-59lbt" event={"ID":"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49","Type":"ContainerStarted","Data":"3a7686e2581bc11e78ddcf69153f7b3a220b9b442922fa0994e143d6a6a8a575"} Oct 02 11:13:48 crc kubenswrapper[4929]: I1002 11:13:48.769119 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"b9925400fc03b2fde5e1ec0f965efa614c075283fd17290328742e2d1ea8b1ee"} Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.771928 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7nz9s" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.771999 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8lgqp" 
podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.772055 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fddvw" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.772164 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-smh4r" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" Oct 02 11:13:48 crc kubenswrapper[4929]: E1002 11:13:48.773148 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4gjxg" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" Oct 02 11:13:49 crc kubenswrapper[4929]: I1002 11:13:49.775723 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-59lbt" event={"ID":"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49","Type":"ContainerStarted","Data":"b59959b9a0c70d5324712146fdaf8a9fe4cf81c059656c9622fa864ceb445cc0"} Oct 02 11:13:49 crc kubenswrapper[4929]: I1002 11:13:49.776292 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-59lbt" event={"ID":"1ba53e06-16e6-4e9f-9e29-c0c2bcc74e49","Type":"ContainerStarted","Data":"7196eb0a83e2630c08359d9b7d71f07588a26804dc3f8531674b6ab0019ef5dc"} Oct 02 11:13:51 crc kubenswrapper[4929]: I1002 11:13:51.788740 4929 generic.go:334] "Generic (PLEG): container finished" podID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerID="bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7" exitCode=0 Oct 02 11:13:51 crc kubenswrapper[4929]: I1002 11:13:51.788817 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74dq9" event={"ID":"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc","Type":"ContainerDied","Data":"bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7"} Oct 02 11:13:51 crc kubenswrapper[4929]: I1002 11:13:51.807569 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-59lbt" podStartSLOduration=207.807551265 podStartE2EDuration="3m27.807551265s" podCreationTimestamp="2025-10-02 11:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:49.790550076 +0000 UTC m=+230.340916480" watchObservedRunningTime="2025-10-02 11:13:51.807551265 +0000 UTC m=+232.357917629" Oct 02 11:13:52 crc kubenswrapper[4929]: I1002 11:13:52.799415 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74dq9" event={"ID":"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc","Type":"ContainerStarted","Data":"abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e"} Oct 02 11:13:52 crc kubenswrapper[4929]: I1002 11:13:52.816747 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-74dq9" 
podStartSLOduration=1.819508045 podStartE2EDuration="1m13.816725585s" podCreationTimestamp="2025-10-02 11:12:39 +0000 UTC" firstStartedPulling="2025-10-02 11:12:40.2267668 +0000 UTC m=+160.777133164" lastFinishedPulling="2025-10-02 11:13:52.22398434 +0000 UTC m=+232.774350704" observedRunningTime="2025-10-02 11:13:52.816314053 +0000 UTC m=+233.366680417" watchObservedRunningTime="2025-10-02 11:13:52.816725585 +0000 UTC m=+233.367091949" Oct 02 11:13:57 crc kubenswrapper[4929]: I1002 11:13:57.835312 4929 generic.go:334] "Generic (PLEG): container finished" podID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerID="80ab1bf356d9169cd1b4c521e219436de835fa17a0a3a6cf9f752acdf61b573b" exitCode=0 Oct 02 11:13:57 crc kubenswrapper[4929]: I1002 11:13:57.835428 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tx64j" event={"ID":"19ba1996-79cb-44bb-a285-5603d1fc649e","Type":"ContainerDied","Data":"80ab1bf356d9169cd1b4c521e219436de835fa17a0a3a6cf9f752acdf61b573b"} Oct 02 11:13:58 crc kubenswrapper[4929]: I1002 11:13:58.845407 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tx64j" event={"ID":"19ba1996-79cb-44bb-a285-5603d1fc649e","Type":"ContainerStarted","Data":"370768a472410f2d337544d58e6b3067b53ff82c58c7c926f98ffb8b87953ea7"} Oct 02 11:13:58 crc kubenswrapper[4929]: I1002 11:13:58.878318 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tx64j" podStartSLOduration=2.820811774 podStartE2EDuration="1m21.878298427s" podCreationTimestamp="2025-10-02 11:12:37 +0000 UTC" firstStartedPulling="2025-10-02 11:12:39.207221761 +0000 UTC m=+159.757588125" lastFinishedPulling="2025-10-02 11:13:58.264708414 +0000 UTC m=+238.815074778" observedRunningTime="2025-10-02 11:13:58.875719938 +0000 UTC m=+239.426086312" watchObservedRunningTime="2025-10-02 11:13:58.878298427 +0000 UTC m=+239.428664801" Oct 02 11:13:59 crc kubenswrapper[4929]: I1002 11:13:59.692432 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-74dq9" Oct 02 11:13:59 crc kubenswrapper[4929]: I1002 11:13:59.692835 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-74dq9" Oct 02 11:13:59 crc kubenswrapper[4929]: I1002 11:13:59.865527 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-74dq9" Oct 02 11:13:59 crc kubenswrapper[4929]: I1002 11:13:59.904888 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-74dq9" Oct 02 11:14:04 crc kubenswrapper[4929]: I1002 11:14:04.895000 4929 generic.go:334] "Generic (PLEG): container finished" podID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerID="cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96" exitCode=0 Oct 02 11:14:04 crc kubenswrapper[4929]: I1002 11:14:04.895196 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nz9s" event={"ID":"46a28329-6450-46ac-b889-ec17f4aca6f2","Type":"ContainerDied","Data":"cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96"} Oct 02 11:14:04 crc kubenswrapper[4929]: I1002 11:14:04.899073 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gjxg" 
event={"ID":"f6a1f6f5-57b7-40f5-a463-356426589a84","Type":"ContainerStarted","Data":"53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099"} Oct 02 11:14:04 crc kubenswrapper[4929]: I1002 11:14:04.904462 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smh4r" event={"ID":"b1257b93-1170-45a1-b46b-a6bb3c4a2bad","Type":"ContainerStarted","Data":"50cbdf8863a91a563374f794f77d155e0e0e7fecfc94cd80e598b18c8521cc7b"} Oct 02 11:14:05 crc kubenswrapper[4929]: I1002 11:14:05.914212 4929 generic.go:334] "Generic (PLEG): container finished" podID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerID="53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099" exitCode=0 Oct 02 11:14:05 crc kubenswrapper[4929]: I1002 11:14:05.914323 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gjxg" event={"ID":"f6a1f6f5-57b7-40f5-a463-356426589a84","Type":"ContainerDied","Data":"53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099"} Oct 02 11:14:05 crc kubenswrapper[4929]: I1002 11:14:05.917543 4929 generic.go:334] "Generic (PLEG): container finished" podID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerID="50cbdf8863a91a563374f794f77d155e0e0e7fecfc94cd80e598b18c8521cc7b" exitCode=0 Oct 02 11:14:05 crc kubenswrapper[4929]: I1002 11:14:05.917594 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smh4r" event={"ID":"b1257b93-1170-45a1-b46b-a6bb3c4a2bad","Type":"ContainerDied","Data":"50cbdf8863a91a563374f794f77d155e0e0e7fecfc94cd80e598b18c8521cc7b"} Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.946631 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nz9s" event={"ID":"46a28329-6450-46ac-b889-ec17f4aca6f2","Type":"ContainerStarted","Data":"bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6"} Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.951947 4929 generic.go:334] "Generic (PLEG): container finished" podID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerID="9c90d1baa4300103df57e46fa9cd934ea9b95e6c5ba2e0e0b0efe2d48db25983" exitCode=0 Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.952099 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkvt7" event={"ID":"ee96c3e5-aca5-4493-922c-72a05f3b6c93","Type":"ContainerDied","Data":"9c90d1baa4300103df57e46fa9cd934ea9b95e6c5ba2e0e0b0efe2d48db25983"} Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.954856 4929 generic.go:334] "Generic (PLEG): container finished" podID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerID="65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6" exitCode=0 Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.954950 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fddvw" event={"ID":"4e5b0e1b-b379-42e5-a7aa-56a1736771ed","Type":"ContainerDied","Data":"65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6"} Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.956615 4929 generic.go:334] "Generic (PLEG): container finished" podID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerID="8997a762674c93afbebcad8cc2765de54646bf0ed4622c4df8950ecef5a18ef8" exitCode=0 Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.956667 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lgqp" 
event={"ID":"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c","Type":"ContainerDied","Data":"8997a762674c93afbebcad8cc2765de54646bf0ed4622c4df8950ecef5a18ef8"} Oct 02 11:14:07 crc kubenswrapper[4929]: I1002 11:14:07.967419 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7nz9s" podStartSLOduration=3.664205344 podStartE2EDuration="1m30.967395451s" podCreationTimestamp="2025-10-02 11:12:37 +0000 UTC" firstStartedPulling="2025-10-02 11:12:39.18086333 +0000 UTC m=+159.731229694" lastFinishedPulling="2025-10-02 11:14:06.484053437 +0000 UTC m=+247.034419801" observedRunningTime="2025-10-02 11:14:07.964450171 +0000 UTC m=+248.514816545" watchObservedRunningTime="2025-10-02 11:14:07.967395451 +0000 UTC m=+248.517761815" Oct 02 11:14:08 crc kubenswrapper[4929]: I1002 11:14:08.354492 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tx64j" Oct 02 11:14:08 crc kubenswrapper[4929]: I1002 11:14:08.354637 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tx64j" Oct 02 11:14:08 crc kubenswrapper[4929]: I1002 11:14:08.405768 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tx64j" Oct 02 11:14:09 crc kubenswrapper[4929]: I1002 11:14:09.001089 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tx64j" Oct 02 11:14:09 crc kubenswrapper[4929]: I1002 11:14:09.970230 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gjxg" event={"ID":"f6a1f6f5-57b7-40f5-a463-356426589a84","Type":"ContainerStarted","Data":"ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c"} Oct 02 11:14:10 crc kubenswrapper[4929]: I1002 11:14:10.299036 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tx64j"] Oct 02 11:14:11 crc kubenswrapper[4929]: I1002 11:14:11.979908 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tx64j" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="registry-server" containerID="cri-o://370768a472410f2d337544d58e6b3067b53ff82c58c7c926f98ffb8b87953ea7" gracePeriod=2 Oct 02 11:14:13 crc kubenswrapper[4929]: I1002 11:14:13.000448 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4gjxg" podStartSLOduration=7.180567299 podStartE2EDuration="1m33.00043153s" podCreationTimestamp="2025-10-02 11:12:40 +0000 UTC" firstStartedPulling="2025-10-02 11:12:42.287232159 +0000 UTC m=+162.837598523" lastFinishedPulling="2025-10-02 11:14:08.1070964 +0000 UTC m=+248.657462754" observedRunningTime="2025-10-02 11:14:12.999691857 +0000 UTC m=+253.550058261" watchObservedRunningTime="2025-10-02 11:14:13.00043153 +0000 UTC m=+253.550797894" Oct 02 11:14:14 crc kubenswrapper[4929]: I1002 11:14:14.996652 4929 generic.go:334] "Generic (PLEG): container finished" podID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerID="370768a472410f2d337544d58e6b3067b53ff82c58c7c926f98ffb8b87953ea7" exitCode=0 Oct 02 11:14:14 crc kubenswrapper[4929]: I1002 11:14:14.996737 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tx64j" 
event={"ID":"19ba1996-79cb-44bb-a285-5603d1fc649e","Type":"ContainerDied","Data":"370768a472410f2d337544d58e6b3067b53ff82c58c7c926f98ffb8b87953ea7"} Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.428995 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tx64j" Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.529541 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxrbv\" (UniqueName: \"kubernetes.io/projected/19ba1996-79cb-44bb-a285-5603d1fc649e-kube-api-access-cxrbv\") pod \"19ba1996-79cb-44bb-a285-5603d1fc649e\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.529603 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-utilities\") pod \"19ba1996-79cb-44bb-a285-5603d1fc649e\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.529657 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-catalog-content\") pod \"19ba1996-79cb-44bb-a285-5603d1fc649e\" (UID: \"19ba1996-79cb-44bb-a285-5603d1fc649e\") " Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.530945 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-utilities" (OuterVolumeSpecName: "utilities") pod "19ba1996-79cb-44bb-a285-5603d1fc649e" (UID: "19ba1996-79cb-44bb-a285-5603d1fc649e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.534978 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ba1996-79cb-44bb-a285-5603d1fc649e-kube-api-access-cxrbv" (OuterVolumeSpecName: "kube-api-access-cxrbv") pod "19ba1996-79cb-44bb-a285-5603d1fc649e" (UID: "19ba1996-79cb-44bb-a285-5603d1fc649e"). InnerVolumeSpecName "kube-api-access-cxrbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.576497 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19ba1996-79cb-44bb-a285-5603d1fc649e" (UID: "19ba1996-79cb-44bb-a285-5603d1fc649e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.630566 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxrbv\" (UniqueName: \"kubernetes.io/projected/19ba1996-79cb-44bb-a285-5603d1fc649e-kube-api-access-cxrbv\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.630619 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:15 crc kubenswrapper[4929]: I1002 11:14:15.630631 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19ba1996-79cb-44bb-a285-5603d1fc649e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.012315 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fddvw" event={"ID":"4e5b0e1b-b379-42e5-a7aa-56a1736771ed","Type":"ContainerStarted","Data":"a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506"} Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.014300 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tx64j" event={"ID":"19ba1996-79cb-44bb-a285-5603d1fc649e","Type":"ContainerDied","Data":"251c782ecc42bf47904c5627f6d0aeb14d30eb1b8c27a8bb56a381c8936e889a"} Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.014315 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tx64j" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.014361 4929 scope.go:117] "RemoveContainer" containerID="370768a472410f2d337544d58e6b3067b53ff82c58c7c926f98ffb8b87953ea7" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.019179 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lgqp" event={"ID":"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c","Type":"ContainerStarted","Data":"b41468bdc5836bd46d12408ccc044acfc9abf9790daeb3d04332149e104624c3"} Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.021630 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkvt7" event={"ID":"ee96c3e5-aca5-4493-922c-72a05f3b6c93","Type":"ContainerStarted","Data":"1995416d1c4ea2d186b15ddf37600b831443938a6289ea8feee53b3615290b29"} Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.026938 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smh4r" event={"ID":"b1257b93-1170-45a1-b46b-a6bb3c4a2bad","Type":"ContainerStarted","Data":"b61d6e2f502f5281736c4d3290fda2f8006439b411000c75359c7040690092c5"} Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.030052 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fddvw" podStartSLOduration=2.6346799340000002 podStartE2EDuration="1m39.030024434s" podCreationTimestamp="2025-10-02 11:12:37 +0000 UTC" firstStartedPulling="2025-10-02 11:12:39.204372684 +0000 UTC m=+159.754739048" lastFinishedPulling="2025-10-02 11:14:15.599717184 +0000 UTC m=+256.150083548" observedRunningTime="2025-10-02 11:14:16.027995132 +0000 UTC m=+256.578361506" watchObservedRunningTime="2025-10-02 11:14:16.030024434 +0000 UTC m=+256.580390798" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 
11:14:16.035976 4929 scope.go:117] "RemoveContainer" containerID="80ab1bf356d9169cd1b4c521e219436de835fa17a0a3a6cf9f752acdf61b573b" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.048357 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rkvt7" podStartSLOduration=2.76150758 podStartE2EDuration="1m37.048339835s" podCreationTimestamp="2025-10-02 11:12:39 +0000 UTC" firstStartedPulling="2025-10-02 11:12:41.305851054 +0000 UTC m=+161.856217418" lastFinishedPulling="2025-10-02 11:14:15.592683299 +0000 UTC m=+256.143049673" observedRunningTime="2025-10-02 11:14:16.04587249 +0000 UTC m=+256.596238854" watchObservedRunningTime="2025-10-02 11:14:16.048339835 +0000 UTC m=+256.598706199" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.062124 4929 scope.go:117] "RemoveContainer" containerID="da4620ad550e58f9236112b57ac2f3ebeeb8d5adfb38a9c88557ddf167508c23" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.070587 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8lgqp" podStartSLOduration=2.70902463 podStartE2EDuration="1m39.070575546s" podCreationTimestamp="2025-10-02 11:12:37 +0000 UTC" firstStartedPulling="2025-10-02 11:12:39.220866788 +0000 UTC m=+159.771233162" lastFinishedPulling="2025-10-02 11:14:15.582417724 +0000 UTC m=+256.132784078" observedRunningTime="2025-10-02 11:14:16.069642078 +0000 UTC m=+256.620008442" watchObservedRunningTime="2025-10-02 11:14:16.070575546 +0000 UTC m=+256.620941910" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.087658 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smh4r" podStartSLOduration=2.801814242 podStartE2EDuration="1m36.087646109s" podCreationTimestamp="2025-10-02 11:12:40 +0000 UTC" firstStartedPulling="2025-10-02 11:12:42.296319479 +0000 UTC m=+162.846685843" lastFinishedPulling="2025-10-02 11:14:15.582151346 +0000 UTC m=+256.132517710" observedRunningTime="2025-10-02 11:14:16.086298138 +0000 UTC m=+256.636664502" watchObservedRunningTime="2025-10-02 11:14:16.087646109 +0000 UTC m=+256.638012473" Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.110354 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tx64j"] Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.115997 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tx64j"] Oct 02 11:14:16 crc kubenswrapper[4929]: I1002 11:14:16.163208 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" path="/var/lib/kubelet/pods/19ba1996-79cb-44bb-a285-5603d1fc649e/volumes" Oct 02 11:14:17 crc kubenswrapper[4929]: I1002 11:14:17.750284 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:14:17 crc kubenswrapper[4929]: I1002 11:14:17.750854 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:14:17 crc kubenswrapper[4929]: I1002 11:14:17.895205 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:14:17 crc kubenswrapper[4929]: I1002 11:14:17.895251 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:14:17 crc kubenswrapper[4929]: I1002 11:14:17.939218 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:14:18 crc kubenswrapper[4929]: I1002 11:14:18.082671 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:14:18 crc kubenswrapper[4929]: I1002 11:14:18.104937 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:14:18 crc kubenswrapper[4929]: I1002 11:14:18.105004 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:14:18 crc kubenswrapper[4929]: I1002 11:14:18.150564 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:14:18 crc kubenswrapper[4929]: I1002 11:14:18.805541 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fddvw" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="registry-server" probeResult="failure" output=< Oct 02 11:14:18 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 11:14:18 crc kubenswrapper[4929]: > Oct 02 11:14:20 crc kubenswrapper[4929]: I1002 11:14:20.092158 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rkvt7" Oct 02 11:14:20 crc kubenswrapper[4929]: I1002 11:14:20.092440 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rkvt7" Oct 02 11:14:20 crc kubenswrapper[4929]: I1002 11:14:20.134286 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rkvt7" Oct 02 11:14:20 crc kubenswrapper[4929]: I1002 11:14:20.939043 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4gjxg" Oct 02 11:14:20 crc kubenswrapper[4929]: I1002 11:14:20.939099 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4gjxg" Oct 02 11:14:20 crc kubenswrapper[4929]: I1002 11:14:20.973399 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4gjxg" Oct 02 11:14:21 crc kubenswrapper[4929]: I1002 11:14:21.092348 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4gjxg" Oct 02 11:14:21 crc kubenswrapper[4929]: I1002 11:14:21.100840 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rkvt7" Oct 02 11:14:21 crc kubenswrapper[4929]: I1002 11:14:21.303940 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smh4r" Oct 02 11:14:21 crc kubenswrapper[4929]: I1002 11:14:21.304061 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smh4r" Oct 02 11:14:21 crc kubenswrapper[4929]: I1002 11:14:21.351393 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smh4r" Oct 02 11:14:22 crc kubenswrapper[4929]: I1002 11:14:22.095011 
4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smh4r" Oct 02 11:14:24 crc kubenswrapper[4929]: I1002 11:14:24.499552 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkvt7"] Oct 02 11:14:24 crc kubenswrapper[4929]: I1002 11:14:24.500213 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rkvt7" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="registry-server" containerID="cri-o://1995416d1c4ea2d186b15ddf37600b831443938a6289ea8feee53b3615290b29" gracePeriod=2 Oct 02 11:14:24 crc kubenswrapper[4929]: I1002 11:14:24.700711 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smh4r"] Oct 02 11:14:24 crc kubenswrapper[4929]: I1002 11:14:24.701107 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smh4r" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="registry-server" containerID="cri-o://b61d6e2f502f5281736c4d3290fda2f8006439b411000c75359c7040690092c5" gracePeriod=2 Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.081281 4929 generic.go:334] "Generic (PLEG): container finished" podID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerID="1995416d1c4ea2d186b15ddf37600b831443938a6289ea8feee53b3615290b29" exitCode=0 Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.081531 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkvt7" event={"ID":"ee96c3e5-aca5-4493-922c-72a05f3b6c93","Type":"ContainerDied","Data":"1995416d1c4ea2d186b15ddf37600b831443938a6289ea8feee53b3615290b29"} Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.090837 4929 generic.go:334] "Generic (PLEG): container finished" podID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerID="b61d6e2f502f5281736c4d3290fda2f8006439b411000c75359c7040690092c5" exitCode=0 Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.090886 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smh4r" event={"ID":"b1257b93-1170-45a1-b46b-a6bb3c4a2bad","Type":"ContainerDied","Data":"b61d6e2f502f5281736c4d3290fda2f8006439b411000c75359c7040690092c5"} Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.243093 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkvt7" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.344688 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smh4r" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.356310 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-utilities\") pod \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.356411 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlcfz\" (UniqueName: \"kubernetes.io/projected/ee96c3e5-aca5-4493-922c-72a05f3b6c93-kube-api-access-dlcfz\") pod \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.356475 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-catalog-content\") pod \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\" (UID: \"ee96c3e5-aca5-4493-922c-72a05f3b6c93\") " Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.357313 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-utilities" (OuterVolumeSpecName: "utilities") pod "ee96c3e5-aca5-4493-922c-72a05f3b6c93" (UID: "ee96c3e5-aca5-4493-922c-72a05f3b6c93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.364235 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee96c3e5-aca5-4493-922c-72a05f3b6c93-kube-api-access-dlcfz" (OuterVolumeSpecName: "kube-api-access-dlcfz") pod "ee96c3e5-aca5-4493-922c-72a05f3b6c93" (UID: "ee96c3e5-aca5-4493-922c-72a05f3b6c93"). InnerVolumeSpecName "kube-api-access-dlcfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.371636 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee96c3e5-aca5-4493-922c-72a05f3b6c93" (UID: "ee96c3e5-aca5-4493-922c-72a05f3b6c93"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.457703 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-catalog-content\") pod \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.457837 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rq48\" (UniqueName: \"kubernetes.io/projected/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-kube-api-access-9rq48\") pod \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.457886 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-utilities\") pod \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\" (UID: \"b1257b93-1170-45a1-b46b-a6bb3c4a2bad\") " Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.458073 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlcfz\" (UniqueName: \"kubernetes.io/projected/ee96c3e5-aca5-4493-922c-72a05f3b6c93-kube-api-access-dlcfz\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.458085 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.458094 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee96c3e5-aca5-4493-922c-72a05f3b6c93-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.458808 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-utilities" (OuterVolumeSpecName: "utilities") pod "b1257b93-1170-45a1-b46b-a6bb3c4a2bad" (UID: "b1257b93-1170-45a1-b46b-a6bb3c4a2bad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.462568 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-kube-api-access-9rq48" (OuterVolumeSpecName: "kube-api-access-9rq48") pod "b1257b93-1170-45a1-b46b-a6bb3c4a2bad" (UID: "b1257b93-1170-45a1-b46b-a6bb3c4a2bad"). InnerVolumeSpecName "kube-api-access-9rq48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.553070 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1257b93-1170-45a1-b46b-a6bb3c4a2bad" (UID: "b1257b93-1170-45a1-b46b-a6bb3c4a2bad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.558807 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rq48\" (UniqueName: \"kubernetes.io/projected/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-kube-api-access-9rq48\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.558833 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:26 crc kubenswrapper[4929]: I1002 11:14:26.558842 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1257b93-1170-45a1-b46b-a6bb3c4a2bad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.102260 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkvt7" event={"ID":"ee96c3e5-aca5-4493-922c-72a05f3b6c93","Type":"ContainerDied","Data":"0910d14039067f108eba0e411042d98b5dadbae33b364ea072c6d4f2886a552d"} Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.102330 4929 scope.go:117] "RemoveContainer" containerID="1995416d1c4ea2d186b15ddf37600b831443938a6289ea8feee53b3615290b29" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.102482 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkvt7" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.117049 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smh4r" event={"ID":"b1257b93-1170-45a1-b46b-a6bb3c4a2bad","Type":"ContainerDied","Data":"f9f9f31ad5d348d5dbd710e89107827f0616e15a488bf73995ea733bdf1e23b4"} Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.117175 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smh4r" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.137131 4929 scope.go:117] "RemoveContainer" containerID="9c90d1baa4300103df57e46fa9cd934ea9b95e6c5ba2e0e0b0efe2d48db25983" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.144811 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkvt7"] Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.153926 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkvt7"] Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.160206 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smh4r"] Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.165814 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smh4r"] Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.175594 4929 scope.go:117] "RemoveContainer" containerID="ba59d9488f469693648b0e6b29e09e0810a950106f5881b50af741310a8f183d" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.197351 4929 scope.go:117] "RemoveContainer" containerID="b61d6e2f502f5281736c4d3290fda2f8006439b411000c75359c7040690092c5" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.215056 4929 scope.go:117] "RemoveContainer" containerID="50cbdf8863a91a563374f794f77d155e0e0e7fecfc94cd80e598b18c8521cc7b" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.234192 4929 scope.go:117] "RemoveContainer" containerID="5b46c16239990a924b60c6ca6a9fa834da0ba72fd2cb50eb2926e3e4628b32f6" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.799240 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:14:27 crc kubenswrapper[4929]: I1002 11:14:27.866364 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:14:28 crc kubenswrapper[4929]: I1002 11:14:28.164868 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" path="/var/lib/kubelet/pods/b1257b93-1170-45a1-b46b-a6bb3c4a2bad/volumes" Oct 02 11:14:28 crc kubenswrapper[4929]: I1002 11:14:28.166215 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" path="/var/lib/kubelet/pods/ee96c3e5-aca5-4493-922c-72a05f3b6c93/volumes" Oct 02 11:14:28 crc kubenswrapper[4929]: I1002 11:14:28.167207 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:14:30 crc kubenswrapper[4929]: I1002 11:14:30.896103 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lgqp"] Oct 02 11:14:30 crc kubenswrapper[4929]: I1002 11:14:30.896568 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8lgqp" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="registry-server" containerID="cri-o://b41468bdc5836bd46d12408ccc044acfc9abf9790daeb3d04332149e104624c3" gracePeriod=2 Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.142637 4929 generic.go:334] "Generic (PLEG): container finished" podID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerID="b41468bdc5836bd46d12408ccc044acfc9abf9790daeb3d04332149e104624c3" exitCode=0 Oct 02 11:14:31 crc 
kubenswrapper[4929]: I1002 11:14:31.142677 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lgqp" event={"ID":"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c","Type":"ContainerDied","Data":"b41468bdc5836bd46d12408ccc044acfc9abf9790daeb3d04332149e104624c3"} Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.222557 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.327807 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc88b\" (UniqueName: \"kubernetes.io/projected/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-kube-api-access-sc88b\") pod \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.328050 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-utilities\") pod \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.328103 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-catalog-content\") pod \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\" (UID: \"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c\") " Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.329991 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-utilities" (OuterVolumeSpecName: "utilities") pod "8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" (UID: "8e2fc666-f2c4-4d6c-b250-58213fc0dd0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.333459 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-kube-api-access-sc88b" (OuterVolumeSpecName: "kube-api-access-sc88b") pod "8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" (UID: "8e2fc666-f2c4-4d6c-b250-58213fc0dd0c"). InnerVolumeSpecName "kube-api-access-sc88b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.374678 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" (UID: "8e2fc666-f2c4-4d6c-b250-58213fc0dd0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.429528 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.429568 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc88b\" (UniqueName: \"kubernetes.io/projected/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-kube-api-access-sc88b\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:31 crc kubenswrapper[4929]: I1002 11:14:31.429583 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:32 crc kubenswrapper[4929]: I1002 11:14:32.149659 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lgqp" event={"ID":"8e2fc666-f2c4-4d6c-b250-58213fc0dd0c","Type":"ContainerDied","Data":"08e4807299c2366575315b560b3c9ff8fab2f9cb649f14210e340220da2e86a7"} Oct 02 11:14:32 crc kubenswrapper[4929]: I1002 11:14:32.149721 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lgqp" Oct 02 11:14:32 crc kubenswrapper[4929]: I1002 11:14:32.149731 4929 scope.go:117] "RemoveContainer" containerID="b41468bdc5836bd46d12408ccc044acfc9abf9790daeb3d04332149e104624c3" Oct 02 11:14:32 crc kubenswrapper[4929]: I1002 11:14:32.170562 4929 scope.go:117] "RemoveContainer" containerID="8997a762674c93afbebcad8cc2765de54646bf0ed4622c4df8950ecef5a18ef8" Oct 02 11:14:32 crc kubenswrapper[4929]: I1002 11:14:32.183932 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lgqp"] Oct 02 11:14:32 crc kubenswrapper[4929]: I1002 11:14:32.187727 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8lgqp"] Oct 02 11:14:32 crc kubenswrapper[4929]: I1002 11:14:32.214389 4929 scope.go:117] "RemoveContainer" containerID="49ef16407cded8cf166396e5a4a5b0f12035ae410e91ca97662915c0320fa51b" Oct 02 11:14:34 crc kubenswrapper[4929]: I1002 11:14:34.162151 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" path="/var/lib/kubelet/pods/8e2fc666-f2c4-4d6c-b250-58213fc0dd0c/volumes" Oct 02 11:14:42 crc kubenswrapper[4929]: I1002 11:14:42.684775 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cf6n9"] Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.146797 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh"] Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149628 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149656 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149674 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 
11:15:00.149684 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149709 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149729 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149747 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909f4df9-4b16-40ea-9671-1713cf023319" containerName="pruner" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149758 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="909f4df9-4b16-40ea-9671-1713cf023319" containerName="pruner" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149774 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9578a33-3b82-4b8b-8c9a-6701c0422602" containerName="pruner" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149782 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9578a33-3b82-4b8b-8c9a-6701c0422602" containerName="pruner" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149793 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149801 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149814 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149822 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149834 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149842 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149855 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149863 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149874 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149882 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149895 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 
11:15:00.149907 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149923 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149934 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.149950 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.149985 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="extract-utilities" Oct 02 11:15:00 crc kubenswrapper[4929]: E1002 11:15:00.150008 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150017 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="extract-content" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150163 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="909f4df9-4b16-40ea-9671-1713cf023319" containerName="pruner" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150176 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9578a33-3b82-4b8b-8c9a-6701c0422602" containerName="pruner" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150186 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2fc666-f2c4-4d6c-b250-58213fc0dd0c" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150197 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ba1996-79cb-44bb-a285-5603d1fc649e" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150208 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee96c3e5-aca5-4493-922c-72a05f3b6c93" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150226 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1257b93-1170-45a1-b46b-a6bb3c4a2bad" containerName="registry-server" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.150855 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.156422 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.156730 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.190889 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh"] Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.282858 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a0fd470-ccd3-4178-8aec-422779131298-config-volume\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.282907 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvlq\" (UniqueName: \"kubernetes.io/projected/5a0fd470-ccd3-4178-8aec-422779131298-kube-api-access-ssvlq\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.282997 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a0fd470-ccd3-4178-8aec-422779131298-secret-volume\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.384690 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a0fd470-ccd3-4178-8aec-422779131298-secret-volume\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.384796 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a0fd470-ccd3-4178-8aec-422779131298-config-volume\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.384831 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssvlq\" (UniqueName: \"kubernetes.io/projected/5a0fd470-ccd3-4178-8aec-422779131298-kube-api-access-ssvlq\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.385722 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a0fd470-ccd3-4178-8aec-422779131298-config-volume\") pod 
\"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.391418 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a0fd470-ccd3-4178-8aec-422779131298-secret-volume\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.404770 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvlq\" (UniqueName: \"kubernetes.io/projected/5a0fd470-ccd3-4178-8aec-422779131298-kube-api-access-ssvlq\") pod \"collect-profiles-29323395-8v8gh\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.491289 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:00 crc kubenswrapper[4929]: I1002 11:15:00.672359 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh"] Oct 02 11:15:01 crc kubenswrapper[4929]: I1002 11:15:01.304334 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" event={"ID":"5a0fd470-ccd3-4178-8aec-422779131298","Type":"ContainerStarted","Data":"7e45bc0dedef809659788fe293fc565bb44e69320e939136a117b9f84cfa7731"} Oct 02 11:15:01 crc kubenswrapper[4929]: I1002 11:15:01.304812 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" event={"ID":"5a0fd470-ccd3-4178-8aec-422779131298","Type":"ContainerStarted","Data":"7c14b9281b7b5d90e737d5de4cfd8cf8ceb1160877f005d3212ee1ecf29cf22a"} Oct 02 11:15:01 crc kubenswrapper[4929]: I1002 11:15:01.328689 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" podStartSLOduration=1.328658272 podStartE2EDuration="1.328658272s" podCreationTimestamp="2025-10-02 11:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:01.325180945 +0000 UTC m=+301.875547319" watchObservedRunningTime="2025-10-02 11:15:01.328658272 +0000 UTC m=+301.879024636" Oct 02 11:15:02 crc kubenswrapper[4929]: I1002 11:15:02.313109 4929 generic.go:334] "Generic (PLEG): container finished" podID="5a0fd470-ccd3-4178-8aec-422779131298" containerID="7e45bc0dedef809659788fe293fc565bb44e69320e939136a117b9f84cfa7731" exitCode=0 Oct 02 11:15:02 crc kubenswrapper[4929]: I1002 11:15:02.313170 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" event={"ID":"5a0fd470-ccd3-4178-8aec-422779131298","Type":"ContainerDied","Data":"7e45bc0dedef809659788fe293fc565bb44e69320e939136a117b9f84cfa7731"} Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.568631 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.727322 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssvlq\" (UniqueName: \"kubernetes.io/projected/5a0fd470-ccd3-4178-8aec-422779131298-kube-api-access-ssvlq\") pod \"5a0fd470-ccd3-4178-8aec-422779131298\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.727570 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a0fd470-ccd3-4178-8aec-422779131298-secret-volume\") pod \"5a0fd470-ccd3-4178-8aec-422779131298\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.727652 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a0fd470-ccd3-4178-8aec-422779131298-config-volume\") pod \"5a0fd470-ccd3-4178-8aec-422779131298\" (UID: \"5a0fd470-ccd3-4178-8aec-422779131298\") " Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.728887 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a0fd470-ccd3-4178-8aec-422779131298-config-volume" (OuterVolumeSpecName: "config-volume") pod "5a0fd470-ccd3-4178-8aec-422779131298" (UID: "5a0fd470-ccd3-4178-8aec-422779131298"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.738378 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0fd470-ccd3-4178-8aec-422779131298-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5a0fd470-ccd3-4178-8aec-422779131298" (UID: "5a0fd470-ccd3-4178-8aec-422779131298"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.738439 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0fd470-ccd3-4178-8aec-422779131298-kube-api-access-ssvlq" (OuterVolumeSpecName: "kube-api-access-ssvlq") pod "5a0fd470-ccd3-4178-8aec-422779131298" (UID: "5a0fd470-ccd3-4178-8aec-422779131298"). InnerVolumeSpecName "kube-api-access-ssvlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.829571 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssvlq\" (UniqueName: \"kubernetes.io/projected/5a0fd470-ccd3-4178-8aec-422779131298-kube-api-access-ssvlq\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.829704 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5a0fd470-ccd3-4178-8aec-422779131298-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4929]: I1002 11:15:03.829725 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a0fd470-ccd3-4178-8aec-422779131298-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:04 crc kubenswrapper[4929]: I1002 11:15:04.328676 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" event={"ID":"5a0fd470-ccd3-4178-8aec-422779131298","Type":"ContainerDied","Data":"7c14b9281b7b5d90e737d5de4cfd8cf8ceb1160877f005d3212ee1ecf29cf22a"} Oct 02 11:15:04 crc kubenswrapper[4929]: I1002 11:15:04.329309 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c14b9281b7b5d90e737d5de4cfd8cf8ceb1160877f005d3212ee1ecf29cf22a" Oct 02 11:15:04 crc kubenswrapper[4929]: I1002 11:15:04.328815 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh" Oct 02 11:15:07 crc kubenswrapper[4929]: I1002 11:15:07.722896 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" podUID="9a73326b-13cd-4b69-b17b-93cd1b59679c" containerName="oauth-openshift" containerID="cri-o://4b1cd896ad216c083beb89bd7e20943ae0ac3510f6f80ff8577da254951c39fa" gracePeriod=15 Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.364087 4929 generic.go:334] "Generic (PLEG): container finished" podID="9a73326b-13cd-4b69-b17b-93cd1b59679c" containerID="4b1cd896ad216c083beb89bd7e20943ae0ac3510f6f80ff8577da254951c39fa" exitCode=0 Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.364128 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" event={"ID":"9a73326b-13cd-4b69-b17b-93cd1b59679c","Type":"ContainerDied","Data":"4b1cd896ad216c083beb89bd7e20943ae0ac3510f6f80ff8577da254951c39fa"} Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.585312 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.616277 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-56c7c74f4-js8jg"] Oct 02 11:15:08 crc kubenswrapper[4929]: E1002 11:15:08.616507 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0fd470-ccd3-4178-8aec-422779131298" containerName="collect-profiles" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.616521 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0fd470-ccd3-4178-8aec-422779131298" containerName="collect-profiles" Oct 02 11:15:08 crc kubenswrapper[4929]: E1002 11:15:08.616535 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a73326b-13cd-4b69-b17b-93cd1b59679c" containerName="oauth-openshift" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.616545 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a73326b-13cd-4b69-b17b-93cd1b59679c" containerName="oauth-openshift" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.616650 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0fd470-ccd3-4178-8aec-422779131298" containerName="collect-profiles" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.616665 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a73326b-13cd-4b69-b17b-93cd1b59679c" containerName="oauth-openshift" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.617105 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.631113 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-56c7c74f4-js8jg"] Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.738333 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-ocp-branding-template\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739168 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-router-certs\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739243 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-policies\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739261 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-cliconfig\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739301 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-provider-selection\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739317 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-serving-cert\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739358 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-trusted-ca-bundle\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739377 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-error\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739402 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-dir\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739450 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wslb8\" (UniqueName: \"kubernetes.io/projected/9a73326b-13cd-4b69-b17b-93cd1b59679c-kube-api-access-wslb8\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739469 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-idp-0-file-data\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739492 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-session\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739512 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-login\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739528 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-service-ca\") pod \"9a73326b-13cd-4b69-b17b-93cd1b59679c\" (UID: \"9a73326b-13cd-4b69-b17b-93cd1b59679c\") " Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739665 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knctn\" (UniqueName: \"kubernetes.io/projected/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-kube-api-access-knctn\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739686 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-audit-dir\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739707 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-service-ca\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739726 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739746 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739771 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-audit-policies\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739766 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739820 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-router-certs\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739844 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739868 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-error\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739890 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739917 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-login\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.739954 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-session\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.740005 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.740038 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.740072 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.740105 4929 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.740115 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.740208 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.740828 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.741190 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.744142 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.744584 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.744715 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a73326b-13cd-4b69-b17b-93cd1b59679c-kube-api-access-wslb8" (OuterVolumeSpecName: "kube-api-access-wslb8") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "kube-api-access-wslb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.745071 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.745258 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.745492 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.745780 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.745821 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.746039 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9a73326b-13cd-4b69-b17b-93cd1b59679c" (UID: "9a73326b-13cd-4b69-b17b-93cd1b59679c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841070 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841139 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-audit-policies\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841163 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-router-certs\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841186 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841211 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-error\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841234 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841264 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-login\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: 
\"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841293 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-session\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841324 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841343 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841374 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841401 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knctn\" (UniqueName: \"kubernetes.io/projected/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-kube-api-access-knctn\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841425 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-audit-dir\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841446 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-service-ca\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841495 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wslb8\" (UniqueName: \"kubernetes.io/projected/9a73326b-13cd-4b69-b17b-93cd1b59679c-kube-api-access-wslb8\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841510 4929 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841523 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841537 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841551 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841565 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841578 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841592 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841607 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841619 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841633 4929 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9a73326b-13cd-4b69-b17b-93cd1b59679c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.841651 4929 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9a73326b-13cd-4b69-b17b-93cd1b59679c-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.842616 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-service-ca\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " 
pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.843323 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-audit-dir\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.843440 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.844772 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.845358 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-audit-policies\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.845582 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.845918 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-error\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.847775 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.848526 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-session\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.848747 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.850041 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.851232 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-system-router-certs\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.861702 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-v4-0-config-user-template-login\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.865775 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knctn\" (UniqueName: \"kubernetes.io/projected/53450eb1-92e6-42d5-bdee-63d4fd8d9b9e-kube-api-access-knctn\") pod \"oauth-openshift-56c7c74f4-js8jg\" (UID: \"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:08 crc kubenswrapper[4929]: I1002 11:15:08.946067 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.130715 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-56c7c74f4-js8jg"] Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.372796 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.373138 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cf6n9" event={"ID":"9a73326b-13cd-4b69-b17b-93cd1b59679c","Type":"ContainerDied","Data":"be078ba360ac9d00252183dacd3bb1e75b8cc5af47a38a01c86a412b2d40c65d"} Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.373199 4929 scope.go:117] "RemoveContainer" containerID="4b1cd896ad216c083beb89bd7e20943ae0ac3510f6f80ff8577da254951c39fa" Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.374738 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" event={"ID":"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e","Type":"ContainerStarted","Data":"f66d436c8a15d05d2b139e6a89708530ba89589a167aded067e5c310016d5ba4"} Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.374780 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" event={"ID":"53450eb1-92e6-42d5-bdee-63d4fd8d9b9e","Type":"ContainerStarted","Data":"3aaa45684db5bbf98dbf4e1151246c7b6fc796daa8c0a69908d0ed093e3c2cf6"} Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.374950 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.377372 4929 patch_prober.go:28] interesting pod/oauth-openshift-56c7c74f4-js8jg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.55:6443/healthz\": dial tcp 10.217.0.55:6443: connect: connection refused" start-of-body= Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.377458 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" podUID="53450eb1-92e6-42d5-bdee-63d4fd8d9b9e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.55:6443/healthz\": dial tcp 10.217.0.55:6443: connect: connection refused" Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.397595 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" podStartSLOduration=27.397517956 podStartE2EDuration="27.397517956s" podCreationTimestamp="2025-10-02 11:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:09.397406303 +0000 UTC m=+309.947772667" watchObservedRunningTime="2025-10-02 11:15:09.397517956 +0000 UTC m=+309.947884320" Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.419620 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cf6n9"] Oct 02 11:15:09 crc kubenswrapper[4929]: I1002 11:15:09.424298 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cf6n9"] Oct 02 11:15:10 crc kubenswrapper[4929]: I1002 11:15:10.177915 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a73326b-13cd-4b69-b17b-93cd1b59679c" path="/var/lib/kubelet/pods/9a73326b-13cd-4b69-b17b-93cd1b59679c/volumes" Oct 02 11:15:10 crc kubenswrapper[4929]: I1002 11:15:10.400131 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-56c7c74f4-js8jg" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.073133 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7nz9s"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.073816 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7nz9s" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="registry-server" containerID="cri-o://bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6" gracePeriod=30 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.095101 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fddvw"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.095427 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fddvw" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="registry-server" containerID="cri-o://a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506" gracePeriod=30 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.107740 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4pzk"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.108007 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" podUID="78ee48b5-a924-4253-8309-cdff7355ec6d" containerName="marketplace-operator" containerID="cri-o://7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997" gracePeriod=30 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.122301 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-74dq9"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.122590 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-74dq9" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="registry-server" containerID="cri-o://abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e" gracePeriod=30 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.126315 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gjxg"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.126553 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4gjxg" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="registry-server" containerID="cri-o://ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c" gracePeriod=30 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.134200 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c4q96"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.135109 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.141419 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c4q96"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.180767 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p55v\" (UniqueName: \"kubernetes.io/projected/9d676ad8-1218-41d5-b194-aa15bf42d384-kube-api-access-7p55v\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.181160 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9d676ad8-1218-41d5-b194-aa15bf42d384-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.181305 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d676ad8-1218-41d5-b194-aa15bf42d384-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.282197 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9d676ad8-1218-41d5-b194-aa15bf42d384-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.282265 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d676ad8-1218-41d5-b194-aa15bf42d384-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.282295 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p55v\" (UniqueName: \"kubernetes.io/projected/9d676ad8-1218-41d5-b194-aa15bf42d384-kube-api-access-7p55v\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.283901 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d676ad8-1218-41d5-b194-aa15bf42d384-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.289004 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/9d676ad8-1218-41d5-b194-aa15bf42d384-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.304520 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p55v\" (UniqueName: \"kubernetes.io/projected/9d676ad8-1218-41d5-b194-aa15bf42d384-kube-api-access-7p55v\") pod \"marketplace-operator-79b997595-c4q96\" (UID: \"9d676ad8-1218-41d5-b194-aa15bf42d384\") " pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.515431 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.520161 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.530840 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.547416 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.560048 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gjxg" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.565731 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74dq9" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.587176 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p5nw\" (UniqueName: \"kubernetes.io/projected/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-kube-api-access-9p5nw\") pod \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.587600 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxtjk\" (UniqueName: \"kubernetes.io/projected/78ee48b5-a924-4253-8309-cdff7355ec6d-kube-api-access-rxtjk\") pod \"78ee48b5-a924-4253-8309-cdff7355ec6d\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.587737 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-catalog-content\") pod \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.587767 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-utilities\") pod \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.587922 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-utilities\") pod \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.588711 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-utilities" (OuterVolumeSpecName: "utilities") pod "e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" (UID: "e8e362b3-7e54-4408-ad25-b2c32c0aa3bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.589406 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-utilities" (OuterVolumeSpecName: "utilities") pod "4e5b0e1b-b379-42e5-a7aa-56a1736771ed" (UID: "4e5b0e1b-b379-42e5-a7aa-56a1736771ed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.589510 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-trusted-ca\") pod \"78ee48b5-a924-4253-8309-cdff7355ec6d\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590039 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-catalog-content\") pod \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\" (UID: \"4e5b0e1b-b379-42e5-a7aa-56a1736771ed\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590132 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5z6g\" (UniqueName: \"kubernetes.io/projected/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-kube-api-access-h5z6g\") pod \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\" (UID: \"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590212 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-catalog-content\") pod \"f6a1f6f5-57b7-40f5-a463-356426589a84\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590283 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxjn5\" (UniqueName: \"kubernetes.io/projected/f6a1f6f5-57b7-40f5-a463-356426589a84-kube-api-access-zxjn5\") pod \"f6a1f6f5-57b7-40f5-a463-356426589a84\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590359 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-utilities\") pod \"46a28329-6450-46ac-b889-ec17f4aca6f2\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590450 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-catalog-content\") pod \"46a28329-6450-46ac-b889-ec17f4aca6f2\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590535 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-operator-metrics\") pod \"78ee48b5-a924-4253-8309-cdff7355ec6d\" (UID: \"78ee48b5-a924-4253-8309-cdff7355ec6d\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590602 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67tgc\" (UniqueName: \"kubernetes.io/projected/46a28329-6450-46ac-b889-ec17f4aca6f2-kube-api-access-67tgc\") pod \"46a28329-6450-46ac-b889-ec17f4aca6f2\" (UID: \"46a28329-6450-46ac-b889-ec17f4aca6f2\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590672 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-utilities\") pod \"f6a1f6f5-57b7-40f5-a463-356426589a84\" (UID: \"f6a1f6f5-57b7-40f5-a463-356426589a84\") " Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.590914 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.591051 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.591716 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-utilities" (OuterVolumeSpecName: "utilities") pod "46a28329-6450-46ac-b889-ec17f4aca6f2" (UID: "46a28329-6450-46ac-b889-ec17f4aca6f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.592176 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-utilities" (OuterVolumeSpecName: "utilities") pod "f6a1f6f5-57b7-40f5-a463-356426589a84" (UID: "f6a1f6f5-57b7-40f5-a463-356426589a84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.595628 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "78ee48b5-a924-4253-8309-cdff7355ec6d" (UID: "78ee48b5-a924-4253-8309-cdff7355ec6d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.595785 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-kube-api-access-h5z6g" (OuterVolumeSpecName: "kube-api-access-h5z6g") pod "e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" (UID: "e8e362b3-7e54-4408-ad25-b2c32c0aa3bc"). InnerVolumeSpecName "kube-api-access-h5z6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.596282 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "78ee48b5-a924-4253-8309-cdff7355ec6d" (UID: "78ee48b5-a924-4253-8309-cdff7355ec6d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.596304 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ee48b5-a924-4253-8309-cdff7355ec6d-kube-api-access-rxtjk" (OuterVolumeSpecName: "kube-api-access-rxtjk") pod "78ee48b5-a924-4253-8309-cdff7355ec6d" (UID: "78ee48b5-a924-4253-8309-cdff7355ec6d"). InnerVolumeSpecName "kube-api-access-rxtjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.597406 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a28329-6450-46ac-b889-ec17f4aca6f2-kube-api-access-67tgc" (OuterVolumeSpecName: "kube-api-access-67tgc") pod "46a28329-6450-46ac-b889-ec17f4aca6f2" (UID: "46a28329-6450-46ac-b889-ec17f4aca6f2"). InnerVolumeSpecName "kube-api-access-67tgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.615229 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a1f6f5-57b7-40f5-a463-356426589a84-kube-api-access-zxjn5" (OuterVolumeSpecName: "kube-api-access-zxjn5") pod "f6a1f6f5-57b7-40f5-a463-356426589a84" (UID: "f6a1f6f5-57b7-40f5-a463-356426589a84"). InnerVolumeSpecName "kube-api-access-zxjn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.615763 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" (UID: "e8e362b3-7e54-4408-ad25-b2c32c0aa3bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.623209 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-kube-api-access-9p5nw" (OuterVolumeSpecName: "kube-api-access-9p5nw") pod "4e5b0e1b-b379-42e5-a7aa-56a1736771ed" (UID: "4e5b0e1b-b379-42e5-a7aa-56a1736771ed"). InnerVolumeSpecName "kube-api-access-9p5nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.631891 4929 generic.go:334] "Generic (PLEG): container finished" podID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerID="bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6" exitCode=0 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.631976 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7nz9s" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.631993 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nz9s" event={"ID":"46a28329-6450-46ac-b889-ec17f4aca6f2","Type":"ContainerDied","Data":"bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.632073 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7nz9s" event={"ID":"46a28329-6450-46ac-b889-ec17f4aca6f2","Type":"ContainerDied","Data":"ae7715b38fc729f6cb3a1587dd235d612b5472636eb0411a4028d4c843159bfd"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.632098 4929 scope.go:117] "RemoveContainer" containerID="bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.638300 4929 generic.go:334] "Generic (PLEG): container finished" podID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerID="ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c" exitCode=0 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.638342 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4gjxg" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.638354 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gjxg" event={"ID":"f6a1f6f5-57b7-40f5-a463-356426589a84","Type":"ContainerDied","Data":"ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.638386 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gjxg" event={"ID":"f6a1f6f5-57b7-40f5-a463-356426589a84","Type":"ContainerDied","Data":"0d6e00751c439799d93ec4ca120bb307ea5a833a15eeb8406c6aa0267dd681e7"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.641431 4929 generic.go:334] "Generic (PLEG): container finished" podID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerID="a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506" exitCode=0 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.641474 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fddvw" event={"ID":"4e5b0e1b-b379-42e5-a7aa-56a1736771ed","Type":"ContainerDied","Data":"a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.641485 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fddvw" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.641491 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fddvw" event={"ID":"4e5b0e1b-b379-42e5-a7aa-56a1736771ed","Type":"ContainerDied","Data":"9e9e16e41d8ed9f38a4e76512549395e6b35997fb632174d36050854ef6c5d0d"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.642781 4929 generic.go:334] "Generic (PLEG): container finished" podID="78ee48b5-a924-4253-8309-cdff7355ec6d" containerID="7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997" exitCode=0 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.642821 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" event={"ID":"78ee48b5-a924-4253-8309-cdff7355ec6d","Type":"ContainerDied","Data":"7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.642828 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.642835 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h4pzk" event={"ID":"78ee48b5-a924-4253-8309-cdff7355ec6d","Type":"ContainerDied","Data":"0365335113a7856626f812c1d4e964bc32e41e49c38672dd23b923aaed472789"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.645318 4929 generic.go:334] "Generic (PLEG): container finished" podID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerID="abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e" exitCode=0 Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.645360 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74dq9" event={"ID":"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc","Type":"ContainerDied","Data":"abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.645436 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74dq9" event={"ID":"e8e362b3-7e54-4408-ad25-b2c32c0aa3bc","Type":"ContainerDied","Data":"289bc6ade4379244edb614dd188a4aa00b2ceb5c1b9e942112c42019b4d4c3fc"} Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.645375 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74dq9" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.672325 4929 scope.go:117] "RemoveContainer" containerID="cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.682856 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4pzk"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.686313 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h4pzk"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697584 4929 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697616 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67tgc\" (UniqueName: \"kubernetes.io/projected/46a28329-6450-46ac-b889-ec17f4aca6f2-kube-api-access-67tgc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697625 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697635 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p5nw\" (UniqueName: \"kubernetes.io/projected/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-kube-api-access-9p5nw\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697644 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxtjk\" (UniqueName: \"kubernetes.io/projected/78ee48b5-a924-4253-8309-cdff7355ec6d-kube-api-access-rxtjk\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697652 4929 reconciler_common.go:293] 
"Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697659 4929 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78ee48b5-a924-4253-8309-cdff7355ec6d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697667 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5z6g\" (UniqueName: \"kubernetes.io/projected/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc-kube-api-access-h5z6g\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697676 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxjn5\" (UniqueName: \"kubernetes.io/projected/f6a1f6f5-57b7-40f5-a463-356426589a84-kube-api-access-zxjn5\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697683 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.697837 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46a28329-6450-46ac-b889-ec17f4aca6f2" (UID: "46a28329-6450-46ac-b889-ec17f4aca6f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.698121 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e5b0e1b-b379-42e5-a7aa-56a1736771ed" (UID: "4e5b0e1b-b379-42e5-a7aa-56a1736771ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.700225 4929 scope.go:117] "RemoveContainer" containerID="c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.704359 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-74dq9"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.707759 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-74dq9"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.715369 4929 scope.go:117] "RemoveContainer" containerID="bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.715989 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6\": container with ID starting with bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6 not found: ID does not exist" containerID="bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.716049 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6"} err="failed to get container status \"bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6\": rpc error: code = NotFound desc = could not find container \"bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6\": container with ID starting with bc2a51a5e152b17a431a024454493b5f54cd42a6e787f2c4de5c68987b0f99b6 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.716092 4929 scope.go:117] "RemoveContainer" containerID="cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.716618 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96\": container with ID starting with cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96 not found: ID does not exist" containerID="cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.716721 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96"} err="failed to get container status \"cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96\": rpc error: code = NotFound desc = could not find container \"cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96\": container with ID starting with cd81578b2942eb99b27a688d8d54749646a20dbcf2c594cf3f4e5cd18eb38c96 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.716813 4929 scope.go:117] "RemoveContainer" containerID="c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.720147 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba\": container with ID starting with 
c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba not found: ID does not exist" containerID="c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.720198 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba"} err="failed to get container status \"c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba\": rpc error: code = NotFound desc = could not find container \"c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba\": container with ID starting with c0ed97d5abdb7c706a8846a9517ea0a384e432114a072379ac9d02ce23775eba not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.720243 4929 scope.go:117] "RemoveContainer" containerID="ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.736305 4929 scope.go:117] "RemoveContainer" containerID="53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.754323 4929 scope.go:117] "RemoveContainer" containerID="f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.759716 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6a1f6f5-57b7-40f5-a463-356426589a84" (UID: "f6a1f6f5-57b7-40f5-a463-356426589a84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.772335 4929 scope.go:117] "RemoveContainer" containerID="ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.772765 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c\": container with ID starting with ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c not found: ID does not exist" containerID="ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.772831 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c"} err="failed to get container status \"ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c\": rpc error: code = NotFound desc = could not find container \"ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c\": container with ID starting with ad554a3146a1cd5bbc4618f4ca85e5ad93ed3c7418fa93746d8308808ddfab9c not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.772861 4929 scope.go:117] "RemoveContainer" containerID="53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.773426 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099\": container with ID starting with 53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099 not found: ID does not exist" 
containerID="53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.773448 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099"} err="failed to get container status \"53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099\": rpc error: code = NotFound desc = could not find container \"53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099\": container with ID starting with 53bc59fee69b35719a9b4bedfafec3c6efed32f968da8c1d83c64f0cea2df099 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.773487 4929 scope.go:117] "RemoveContainer" containerID="f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.773747 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4\": container with ID starting with f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4 not found: ID does not exist" containerID="f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.773773 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4"} err="failed to get container status \"f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4\": rpc error: code = NotFound desc = could not find container \"f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4\": container with ID starting with f3c749d2463d51fb427b8d665d7d0d73cc9cc1d99357ee247775e97ff12a5ea4 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.773792 4929 scope.go:117] "RemoveContainer" containerID="a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.787463 4929 scope.go:117] "RemoveContainer" containerID="65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.805209 4929 scope.go:117] "RemoveContainer" containerID="86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.805530 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5b0e1b-b379-42e5-a7aa-56a1736771ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.805549 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6a1f6f5-57b7-40f5-a463-356426589a84-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.805558 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46a28329-6450-46ac-b889-ec17f4aca6f2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.822932 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c4q96"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.823109 4929 scope.go:117] "RemoveContainer" 
containerID="a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.823527 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506\": container with ID starting with a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506 not found: ID does not exist" containerID="a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.823555 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506"} err="failed to get container status \"a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506\": rpc error: code = NotFound desc = could not find container \"a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506\": container with ID starting with a9205d9ba38707bac1a37d6e000dd7df49bd84bed5bad9549b7d57cf65b9a506 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.823575 4929 scope.go:117] "RemoveContainer" containerID="65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.823848 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6\": container with ID starting with 65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6 not found: ID does not exist" containerID="65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.823868 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6"} err="failed to get container status \"65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6\": rpc error: code = NotFound desc = could not find container \"65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6\": container with ID starting with 65fe5ba4213020b3683930bcc2d105c0e207431dc61b3f5df428a9209fadadc6 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.823880 4929 scope.go:117] "RemoveContainer" containerID="86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.824104 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af\": container with ID starting with 86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af not found: ID does not exist" containerID="86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.824119 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af"} err="failed to get container status \"86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af\": rpc error: code = NotFound desc = could not find container \"86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af\": container with ID starting with 
86c31359aa628f66f474b3e90d7797366c4cd334361d14d6547851db89cf96af not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.824133 4929 scope.go:117] "RemoveContainer" containerID="7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.837478 4929 scope.go:117] "RemoveContainer" containerID="7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.837783 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997\": container with ID starting with 7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997 not found: ID does not exist" containerID="7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.837821 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997"} err="failed to get container status \"7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997\": rpc error: code = NotFound desc = could not find container \"7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997\": container with ID starting with 7fa5ff3dfe50008751e692b0eef9dbf3b3cfa176c910efe98969f62e08a42997 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.837847 4929 scope.go:117] "RemoveContainer" containerID="abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.859210 4929 scope.go:117] "RemoveContainer" containerID="bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.884673 4929 scope.go:117] "RemoveContainer" containerID="3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.907579 4929 scope.go:117] "RemoveContainer" containerID="abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.908114 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e\": container with ID starting with abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e not found: ID does not exist" containerID="abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.908163 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e"} err="failed to get container status \"abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e\": rpc error: code = NotFound desc = could not find container \"abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e\": container with ID starting with abd1e8badebb933173cea215b36f19337d4b3cbc16b57c138e17ddae37c3d74e not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.908196 4929 scope.go:117] "RemoveContainer" containerID="bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.908518 4929 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7\": container with ID starting with bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7 not found: ID does not exist" containerID="bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.908545 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7"} err="failed to get container status \"bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7\": rpc error: code = NotFound desc = could not find container \"bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7\": container with ID starting with bf039fab6fdf81cf5275541de6a5e88cfaf561ae0187e34bf5e35e8e14ea9ee7 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.908566 4929 scope.go:117] "RemoveContainer" containerID="3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7" Oct 02 11:15:44 crc kubenswrapper[4929]: E1002 11:15:44.908832 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7\": container with ID starting with 3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7 not found: ID does not exist" containerID="3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.908854 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7"} err="failed to get container status \"3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7\": rpc error: code = NotFound desc = could not find container \"3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7\": container with ID starting with 3ffaaab960bb238a76024bd59167a2baa496dd7d0cd4240809d59664ddaa82b7 not found: ID does not exist" Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.965209 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7nz9s"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.972300 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7nz9s"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.983099 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gjxg"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.988436 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4gjxg"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.992812 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fddvw"] Oct 02 11:15:44 crc kubenswrapper[4929]: I1002 11:15:44.995565 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fddvw"] Oct 02 11:15:45 crc kubenswrapper[4929]: I1002 11:15:45.653176 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" 
event={"ID":"9d676ad8-1218-41d5-b194-aa15bf42d384","Type":"ContainerStarted","Data":"03994b5d6bdb149f8297621eea0b1a300ba6ba343c9075744aa38bdb093e13f4"} Oct 02 11:15:45 crc kubenswrapper[4929]: I1002 11:15:45.653439 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" event={"ID":"9d676ad8-1218-41d5-b194-aa15bf42d384","Type":"ContainerStarted","Data":"1b1e247e501fcc8846f1e1114a389d4bba561d76155c20c815b12477dc319765"} Oct 02 11:15:45 crc kubenswrapper[4929]: I1002 11:15:45.653457 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:45 crc kubenswrapper[4929]: I1002 11:15:45.656367 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" Oct 02 11:15:45 crc kubenswrapper[4929]: I1002 11:15:45.672271 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-c4q96" podStartSLOduration=1.672246962 podStartE2EDuration="1.672246962s" podCreationTimestamp="2025-10-02 11:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:45.66825845 +0000 UTC m=+346.218624814" watchObservedRunningTime="2025-10-02 11:15:45.672246962 +0000 UTC m=+346.222613356" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.163179 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" path="/var/lib/kubelet/pods/46a28329-6450-46ac-b889-ec17f4aca6f2/volumes" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.164076 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" path="/var/lib/kubelet/pods/4e5b0e1b-b379-42e5-a7aa-56a1736771ed/volumes" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.164924 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ee48b5-a924-4253-8309-cdff7355ec6d" path="/var/lib/kubelet/pods/78ee48b5-a924-4253-8309-cdff7355ec6d/volumes" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.166109 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" path="/var/lib/kubelet/pods/e8e362b3-7e54-4408-ad25-b2c32c0aa3bc/volumes" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.166881 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" path="/var/lib/kubelet/pods/f6a1f6f5-57b7-40f5-a463-356426589a84/volumes" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291565 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtt6"] Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291755 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ee48b5-a924-4253-8309-cdff7355ec6d" containerName="marketplace-operator" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291766 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ee48b5-a924-4253-8309-cdff7355ec6d" containerName="marketplace-operator" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291776 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 
11:15:46.291782 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291792 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291799 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291807 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291815 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291822 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291828 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291835 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291841 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291848 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291853 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291861 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291866 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291880 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291885 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291893 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291899 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291905 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="registry-server" Oct 02 11:15:46 
crc kubenswrapper[4929]: I1002 11:15:46.291911 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291919 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291925 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="extract-content" Oct 02 11:15:46 crc kubenswrapper[4929]: E1002 11:15:46.291932 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.291938 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="extract-utilities" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.292051 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e362b3-7e54-4408-ad25-b2c32c0aa3bc" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.292065 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a28329-6450-46ac-b889-ec17f4aca6f2" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.292075 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5b0e1b-b379-42e5-a7aa-56a1736771ed" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.292082 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ee48b5-a924-4253-8309-cdff7355ec6d" containerName="marketplace-operator" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.292091 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a1f6f5-57b7-40f5-a463-356426589a84" containerName="registry-server" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.294005 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.296545 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.302283 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtt6"] Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.322514 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlcb\" (UniqueName: \"kubernetes.io/projected/01d6460b-931e-456e-ae3d-8b9216249c60-kube-api-access-wqlcb\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.322604 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d6460b-931e-456e-ae3d-8b9216249c60-utilities\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.322624 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d6460b-931e-456e-ae3d-8b9216249c60-catalog-content\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.423552 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d6460b-931e-456e-ae3d-8b9216249c60-utilities\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.423621 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d6460b-931e-456e-ae3d-8b9216249c60-catalog-content\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.423658 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlcb\" (UniqueName: \"kubernetes.io/projected/01d6460b-931e-456e-ae3d-8b9216249c60-kube-api-access-wqlcb\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.424234 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d6460b-931e-456e-ae3d-8b9216249c60-utilities\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.424251 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d6460b-931e-456e-ae3d-8b9216249c60-catalog-content\") pod \"redhat-marketplace-rvtt6\" (UID: 
\"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.447317 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlcb\" (UniqueName: \"kubernetes.io/projected/01d6460b-931e-456e-ae3d-8b9216249c60-kube-api-access-wqlcb\") pod \"redhat-marketplace-rvtt6\" (UID: \"01d6460b-931e-456e-ae3d-8b9216249c60\") " pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.493322 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cn5md"] Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.496660 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.498876 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.506698 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cn5md"] Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.525209 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lwq8\" (UniqueName: \"kubernetes.io/projected/bb1e8738-5db8-4a58-961d-82f554c9f39b-kube-api-access-9lwq8\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.525299 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-catalog-content\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.525327 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-utilities\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.617846 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.626548 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-catalog-content\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.626617 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-utilities\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.626664 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lwq8\" (UniqueName: \"kubernetes.io/projected/bb1e8738-5db8-4a58-961d-82f554c9f39b-kube-api-access-9lwq8\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.627116 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-catalog-content\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.627133 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-utilities\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.662621 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lwq8\" (UniqueName: \"kubernetes.io/projected/bb1e8738-5db8-4a58-961d-82f554c9f39b-kube-api-access-9lwq8\") pod \"certified-operators-cn5md\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.812824 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtt6"] Oct 02 11:15:46 crc kubenswrapper[4929]: I1002 11:15:46.824884 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:47 crc kubenswrapper[4929]: I1002 11:15:47.013561 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cn5md"] Oct 02 11:15:47 crc kubenswrapper[4929]: W1002 11:15:47.072565 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb1e8738_5db8_4a58_961d_82f554c9f39b.slice/crio-07f45787636421552b86c7d1e0f367bcab23c9bb1b609295a717beddb3a20d7a WatchSource:0}: Error finding container 07f45787636421552b86c7d1e0f367bcab23c9bb1b609295a717beddb3a20d7a: Status 404 returned error can't find the container with id 07f45787636421552b86c7d1e0f367bcab23c9bb1b609295a717beddb3a20d7a Oct 02 11:15:47 crc kubenswrapper[4929]: I1002 11:15:47.674833 4929 generic.go:334] "Generic (PLEG): container finished" podID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerID="acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c" exitCode=0 Oct 02 11:15:47 crc kubenswrapper[4929]: I1002 11:15:47.674879 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn5md" event={"ID":"bb1e8738-5db8-4a58-961d-82f554c9f39b","Type":"ContainerDied","Data":"acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c"} Oct 02 11:15:47 crc kubenswrapper[4929]: I1002 11:15:47.675154 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn5md" event={"ID":"bb1e8738-5db8-4a58-961d-82f554c9f39b","Type":"ContainerStarted","Data":"07f45787636421552b86c7d1e0f367bcab23c9bb1b609295a717beddb3a20d7a"} Oct 02 11:15:47 crc kubenswrapper[4929]: I1002 11:15:47.677246 4929 generic.go:334] "Generic (PLEG): container finished" podID="01d6460b-931e-456e-ae3d-8b9216249c60" containerID="052b73cd234da086a08ecca512c9a8cac1882915ba3477ed0616c2af0c5f1570" exitCode=0 Oct 02 11:15:47 crc kubenswrapper[4929]: I1002 11:15:47.677444 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtt6" event={"ID":"01d6460b-931e-456e-ae3d-8b9216249c60","Type":"ContainerDied","Data":"052b73cd234da086a08ecca512c9a8cac1882915ba3477ed0616c2af0c5f1570"} Oct 02 11:15:47 crc kubenswrapper[4929]: I1002 11:15:47.678810 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtt6" event={"ID":"01d6460b-931e-456e-ae3d-8b9216249c60","Type":"ContainerStarted","Data":"7c4c4b0b627fecd4fb454be28b465022b801a899b00235a3a1fe99f2399ff08f"} Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.690981 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z57r9"] Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.692139 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.693882 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.703794 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z57r9"] Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.853130 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-catalog-content\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.853245 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2t6\" (UniqueName: \"kubernetes.io/projected/94b01c90-c88b-4218-9287-e4f5df0e2677-kube-api-access-kh2t6\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.853325 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-utilities\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.888751 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r2mws"] Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.892425 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.894372 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.899563 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2mws"] Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.954739 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-utilities\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.954835 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-catalog-content\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.954878 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2t6\" (UniqueName: \"kubernetes.io/projected/94b01c90-c88b-4218-9287-e4f5df0e2677-kube-api-access-kh2t6\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.955248 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-catalog-content\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.956046 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-utilities\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:48 crc kubenswrapper[4929]: I1002 11:15:48.974434 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2t6\" (UniqueName: \"kubernetes.io/projected/94b01c90-c88b-4218-9287-e4f5df0e2677-kube-api-access-kh2t6\") pod \"redhat-operators-z57r9\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") " pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.017361 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.055753 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8107f56e-8ec5-4eee-a71a-49d929d35a2d-catalog-content\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.055848 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfns9\" (UniqueName: \"kubernetes.io/projected/8107f56e-8ec5-4eee-a71a-49d929d35a2d-kube-api-access-zfns9\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.055942 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8107f56e-8ec5-4eee-a71a-49d929d35a2d-utilities\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.157738 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfns9\" (UniqueName: \"kubernetes.io/projected/8107f56e-8ec5-4eee-a71a-49d929d35a2d-kube-api-access-zfns9\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.158227 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8107f56e-8ec5-4eee-a71a-49d929d35a2d-utilities\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.158330 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8107f56e-8ec5-4eee-a71a-49d929d35a2d-catalog-content\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.158997 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8107f56e-8ec5-4eee-a71a-49d929d35a2d-catalog-content\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.159194 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8107f56e-8ec5-4eee-a71a-49d929d35a2d-utilities\") pod \"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.175701 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfns9\" (UniqueName: \"kubernetes.io/projected/8107f56e-8ec5-4eee-a71a-49d929d35a2d-kube-api-access-zfns9\") pod 
\"community-operators-r2mws\" (UID: \"8107f56e-8ec5-4eee-a71a-49d929d35a2d\") " pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.211279 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.253543 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z57r9"] Oct 02 11:15:49 crc kubenswrapper[4929]: W1002 11:15:49.273271 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b01c90_c88b_4218_9287_e4f5df0e2677.slice/crio-e5b033259894be4d4feef6deafe5c95b458d308b11c2335b07add7030e5ec891 WatchSource:0}: Error finding container e5b033259894be4d4feef6deafe5c95b458d308b11c2335b07add7030e5ec891: Status 404 returned error can't find the container with id e5b033259894be4d4feef6deafe5c95b458d308b11c2335b07add7030e5ec891 Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.423928 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2mws"] Oct 02 11:15:49 crc kubenswrapper[4929]: W1002 11:15:49.425613 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8107f56e_8ec5_4eee_a71a_49d929d35a2d.slice/crio-dab2fdd23737f7437ca27d2ff2611ebd761e28d093eee6eb424ac968d0ced4b6 WatchSource:0}: Error finding container dab2fdd23737f7437ca27d2ff2611ebd761e28d093eee6eb424ac968d0ced4b6: Status 404 returned error can't find the container with id dab2fdd23737f7437ca27d2ff2611ebd761e28d093eee6eb424ac968d0ced4b6 Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.694399 4929 generic.go:334] "Generic (PLEG): container finished" podID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerID="8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18" exitCode=0 Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.694444 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z57r9" event={"ID":"94b01c90-c88b-4218-9287-e4f5df0e2677","Type":"ContainerDied","Data":"8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18"} Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.694480 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z57r9" event={"ID":"94b01c90-c88b-4218-9287-e4f5df0e2677","Type":"ContainerStarted","Data":"e5b033259894be4d4feef6deafe5c95b458d308b11c2335b07add7030e5ec891"} Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.696751 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn5md" event={"ID":"bb1e8738-5db8-4a58-961d-82f554c9f39b","Type":"ContainerStarted","Data":"0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0"} Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.699062 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2mws" event={"ID":"8107f56e-8ec5-4eee-a71a-49d929d35a2d","Type":"ContainerStarted","Data":"dab2fdd23737f7437ca27d2ff2611ebd761e28d093eee6eb424ac968d0ced4b6"} Oct 02 11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.701547 4929 generic.go:334] "Generic (PLEG): container finished" podID="01d6460b-931e-456e-ae3d-8b9216249c60" containerID="20c444b8df89e399451363a037b4fb6605c87f81caec8d7573209c1612c162b4" exitCode=0 Oct 02 
11:15:49 crc kubenswrapper[4929]: I1002 11:15:49.701582 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtt6" event={"ID":"01d6460b-931e-456e-ae3d-8b9216249c60","Type":"ContainerDied","Data":"20c444b8df89e399451363a037b4fb6605c87f81caec8d7573209c1612c162b4"} Oct 02 11:15:50 crc kubenswrapper[4929]: I1002 11:15:50.707768 4929 generic.go:334] "Generic (PLEG): container finished" podID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerID="0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0" exitCode=0 Oct 02 11:15:50 crc kubenswrapper[4929]: I1002 11:15:50.707874 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn5md" event={"ID":"bb1e8738-5db8-4a58-961d-82f554c9f39b","Type":"ContainerDied","Data":"0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0"} Oct 02 11:15:50 crc kubenswrapper[4929]: I1002 11:15:50.711486 4929 generic.go:334] "Generic (PLEG): container finished" podID="8107f56e-8ec5-4eee-a71a-49d929d35a2d" containerID="e8c7e41ea56df2bd93d777716dddc516fabbb56e708a9623d44536fb86e6f4c6" exitCode=0 Oct 02 11:15:50 crc kubenswrapper[4929]: I1002 11:15:50.711537 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2mws" event={"ID":"8107f56e-8ec5-4eee-a71a-49d929d35a2d","Type":"ContainerDied","Data":"e8c7e41ea56df2bd93d777716dddc516fabbb56e708a9623d44536fb86e6f4c6"} Oct 02 11:15:52 crc kubenswrapper[4929]: I1002 11:15:52.722894 4929 generic.go:334] "Generic (PLEG): container finished" podID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerID="99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba" exitCode=0 Oct 02 11:15:52 crc kubenswrapper[4929]: I1002 11:15:52.723018 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z57r9" event={"ID":"94b01c90-c88b-4218-9287-e4f5df0e2677","Type":"ContainerDied","Data":"99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba"} Oct 02 11:15:53 crc kubenswrapper[4929]: I1002 11:15:53.734915 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtt6" event={"ID":"01d6460b-931e-456e-ae3d-8b9216249c60","Type":"ContainerStarted","Data":"84c9c1441252b415e964f6bca8e110f2464f3a37ae501ee24340bd8957f9c8cc"} Oct 02 11:15:53 crc kubenswrapper[4929]: I1002 11:15:53.756803 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rvtt6" podStartSLOduration=3.251735899 podStartE2EDuration="7.756785362s" podCreationTimestamp="2025-10-02 11:15:46 +0000 UTC" firstStartedPulling="2025-10-02 11:15:47.678582976 +0000 UTC m=+348.228949330" lastFinishedPulling="2025-10-02 11:15:52.183632429 +0000 UTC m=+352.733998793" observedRunningTime="2025-10-02 11:15:53.753719608 +0000 UTC m=+354.304085972" watchObservedRunningTime="2025-10-02 11:15:53.756785362 +0000 UTC m=+354.307151736" Oct 02 11:15:54 crc kubenswrapper[4929]: I1002 11:15:54.741514 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn5md" event={"ID":"bb1e8738-5db8-4a58-961d-82f554c9f39b","Type":"ContainerStarted","Data":"1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2"} Oct 02 11:15:54 crc kubenswrapper[4929]: I1002 11:15:54.756500 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cn5md" podStartSLOduration=2.72490166 
podStartE2EDuration="8.756480908s" podCreationTimestamp="2025-10-02 11:15:46 +0000 UTC" firstStartedPulling="2025-10-02 11:15:47.67642839 +0000 UTC m=+348.226794754" lastFinishedPulling="2025-10-02 11:15:53.708007638 +0000 UTC m=+354.258374002" observedRunningTime="2025-10-02 11:15:54.754601031 +0000 UTC m=+355.304967405" watchObservedRunningTime="2025-10-02 11:15:54.756480908 +0000 UTC m=+355.306847272" Oct 02 11:15:56 crc kubenswrapper[4929]: I1002 11:15:56.618660 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:56 crc kubenswrapper[4929]: I1002 11:15:56.619018 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:56 crc kubenswrapper[4929]: I1002 11:15:56.676697 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:15:56 crc kubenswrapper[4929]: I1002 11:15:56.825977 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:56 crc kubenswrapper[4929]: I1002 11:15:56.826025 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:56 crc kubenswrapper[4929]: I1002 11:15:56.862755 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:15:57 crc kubenswrapper[4929]: I1002 11:15:57.757534 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z57r9" event={"ID":"94b01c90-c88b-4218-9287-e4f5df0e2677","Type":"ContainerStarted","Data":"73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892"} Oct 02 11:15:57 crc kubenswrapper[4929]: I1002 11:15:57.760156 4929 generic.go:334] "Generic (PLEG): container finished" podID="8107f56e-8ec5-4eee-a71a-49d929d35a2d" containerID="7bebef163875a0d8f7aee24902e1867208f51879f1fe9d5518f4faa277759178" exitCode=0 Oct 02 11:15:57 crc kubenswrapper[4929]: I1002 11:15:57.760217 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2mws" event={"ID":"8107f56e-8ec5-4eee-a71a-49d929d35a2d","Type":"ContainerDied","Data":"7bebef163875a0d8f7aee24902e1867208f51879f1fe9d5518f4faa277759178"} Oct 02 11:15:57 crc kubenswrapper[4929]: I1002 11:15:57.785076 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z57r9" podStartSLOduration=2.849440156 podStartE2EDuration="9.785033048s" podCreationTimestamp="2025-10-02 11:15:48 +0000 UTC" firstStartedPulling="2025-10-02 11:15:49.696238088 +0000 UTC m=+350.246604492" lastFinishedPulling="2025-10-02 11:15:56.63183102 +0000 UTC m=+357.182197384" observedRunningTime="2025-10-02 11:15:57.779367985 +0000 UTC m=+358.329734349" watchObservedRunningTime="2025-10-02 11:15:57.785033048 +0000 UTC m=+358.335399412" Oct 02 11:15:59 crc kubenswrapper[4929]: I1002 11:15:59.018255 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:15:59 crc kubenswrapper[4929]: I1002 11:15:59.018302 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:16:00 crc kubenswrapper[4929]: I1002 11:16:00.063223 4929 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-z57r9" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="registry-server" probeResult="failure" output=< Oct 02 11:16:00 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 11:16:00 crc kubenswrapper[4929]: > Oct 02 11:16:02 crc kubenswrapper[4929]: I1002 11:16:02.787179 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2mws" event={"ID":"8107f56e-8ec5-4eee-a71a-49d929d35a2d","Type":"ContainerStarted","Data":"2fb00c68e6ce7ba3d69c60dad842650924595c15a4f0f1312440ca491a993675"} Oct 02 11:16:02 crc kubenswrapper[4929]: I1002 11:16:02.803083 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r2mws" podStartSLOduration=3.479590641 podStartE2EDuration="14.803069587s" podCreationTimestamp="2025-10-02 11:15:48 +0000 UTC" firstStartedPulling="2025-10-02 11:15:50.712943774 +0000 UTC m=+351.263310148" lastFinishedPulling="2025-10-02 11:16:02.03642269 +0000 UTC m=+362.586789094" observedRunningTime="2025-10-02 11:16:02.801667284 +0000 UTC m=+363.352033658" watchObservedRunningTime="2025-10-02 11:16:02.803069587 +0000 UTC m=+363.353435951" Oct 02 11:16:06 crc kubenswrapper[4929]: I1002 11:16:06.689106 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rvtt6" Oct 02 11:16:06 crc kubenswrapper[4929]: I1002 11:16:06.867096 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 11:16:09 crc kubenswrapper[4929]: I1002 11:16:09.070330 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:16:09 crc kubenswrapper[4929]: I1002 11:16:09.121841 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z57r9" Oct 02 11:16:09 crc kubenswrapper[4929]: I1002 11:16:09.212257 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:16:09 crc kubenswrapper[4929]: I1002 11:16:09.212312 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:16:09 crc kubenswrapper[4929]: I1002 11:16:09.258371 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:16:09 crc kubenswrapper[4929]: I1002 11:16:09.866765 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r2mws" Oct 02 11:16:14 crc kubenswrapper[4929]: I1002 11:16:14.737349 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:16:14 crc kubenswrapper[4929]: I1002 11:16:14.737415 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:16:44 
Oct 02 11:16:44 crc kubenswrapper[4929]: I1002 11:16:44.737225 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:16:44 crc kubenswrapper[4929]: I1002 11:16:44.737651 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:17:14 crc kubenswrapper[4929]: I1002 11:17:14.736530 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:17:14 crc kubenswrapper[4929]: I1002 11:17:14.738110 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:17:14 crc kubenswrapper[4929]: I1002 11:17:14.738194 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 11:17:14 crc kubenswrapper[4929]: I1002 11:17:14.738870 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9925400fc03b2fde5e1ec0f965efa614c075283fd17290328742e2d1ea8b1ee"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 11:17:14 crc kubenswrapper[4929]: I1002 11:17:14.738940 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://b9925400fc03b2fde5e1ec0f965efa614c075283fd17290328742e2d1ea8b1ee" gracePeriod=600
Oct 02 11:17:15 crc kubenswrapper[4929]: I1002 11:17:15.219945 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="b9925400fc03b2fde5e1ec0f965efa614c075283fd17290328742e2d1ea8b1ee" exitCode=0
Oct 02 11:17:15 crc kubenswrapper[4929]: I1002 11:17:15.220258 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"b9925400fc03b2fde5e1ec0f965efa614c075283fd17290328742e2d1ea8b1ee"}
Oct 02 11:17:15 crc kubenswrapper[4929]: I1002 11:17:15.220285 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"734030d7b32aee89aaf1f696dd592d8b76828337b0add53e8b02a123d5ff922c"}
Oct 02 11:17:15 crc kubenswrapper[4929]: I1002 11:17:15.220303 4929 scope.go:117] "RemoveContainer"
containerID="c56c80fb9f3926f605c04b78742318b924679e13bc5ceb9834e23994b17b0512" Oct 02 11:17:39 crc kubenswrapper[4929]: I1002 11:17:39.892974 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-77plf"] Oct 02 11:17:39 crc kubenswrapper[4929]: I1002 11:17:39.894758 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:39 crc kubenswrapper[4929]: I1002 11:17:39.910931 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-77plf"] Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.041905 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a42d9c2-c5b4-400c-a4da-c89943204f90-trusted-ca\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.042064 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-bound-sa-token\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.042123 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-registry-tls\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.042165 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a42d9c2-c5b4-400c-a4da-c89943204f90-ca-trust-extracted\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.042227 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a42d9c2-c5b4-400c-a4da-c89943204f90-installation-pull-secrets\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.042333 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.042363 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a42d9c2-c5b4-400c-a4da-c89943204f90-registry-certificates\") pod 
\"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.043238 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvfmv\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-kube-api-access-pvfmv\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.066678 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.144379 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a42d9c2-c5b4-400c-a4da-c89943204f90-registry-certificates\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.144437 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvfmv\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-kube-api-access-pvfmv\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.144478 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a42d9c2-c5b4-400c-a4da-c89943204f90-trusted-ca\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.144504 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-bound-sa-token\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.145162 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-registry-tls\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.145255 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a42d9c2-c5b4-400c-a4da-c89943204f90-ca-trust-extracted\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf" Oct 02 11:17:40 crc 
kubenswrapper[4929]: I1002 11:17:40.145294 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a42d9c2-c5b4-400c-a4da-c89943204f90-installation-pull-secrets\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.145993 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a42d9c2-c5b4-400c-a4da-c89943204f90-ca-trust-extracted\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.146212 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a42d9c2-c5b4-400c-a4da-c89943204f90-trusted-ca\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.147410 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a42d9c2-c5b4-400c-a4da-c89943204f90-registry-certificates\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.154278 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a42d9c2-c5b4-400c-a4da-c89943204f90-installation-pull-secrets\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.154510 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-registry-tls\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.161939 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvfmv\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-kube-api-access-pvfmv\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.164863 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a42d9c2-c5b4-400c-a4da-c89943204f90-bound-sa-token\") pod \"image-registry-66df7c8f76-77plf\" (UID: \"5a42d9c2-c5b4-400c-a4da-c89943204f90\") " pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.221696 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
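Each volume above walks the same reconciler pipeline: operationExecutor.VerifyControllerAttachedVolume, then MountVolume, then a MountVolume.SetUp success. Once SetUp succeeds, the volume is materialized under /var/lib/kubelet/pods/<podUID>/volumes/<plugin>/<name>, the same directory tree the "Cleaned up orphaned pod volumes dir" entry later removes. A hypothetical helper sketch (not kubelet code; run as root on the node) that lists what has been set up for this pod:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // Pod UID of the image-registry replica from the entries above.
        podUID := "5a42d9c2-c5b4-400c-a4da-c89943204f90"
        root := filepath.Join("/var/lib/kubelet/pods", podUID, "volumes")

        plugins, err := os.ReadDir(root) // plugin dirs, e.g. kubernetes.io~configmap
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        for _, plugin := range plugins {
            vols, err := os.ReadDir(filepath.Join(root, plugin.Name()))
            if err != nil {
                fmt.Fprintln(os.Stderr, err)
                continue
            }
            for _, vol := range vols {
                // e.g. kubernetes.io~configmap/trusted-ca, kubernetes.io~projected/registry-tls
                fmt.Printf("%s/%s\n", plugin.Name(), vol.Name())
            }
        }
    }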
Oct 02 11:17:40 crc kubenswrapper[4929]: I1002 11:17:40.426121 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-77plf"]
Oct 02 11:17:41 crc kubenswrapper[4929]: I1002 11:17:41.387867 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-77plf" event={"ID":"5a42d9c2-c5b4-400c-a4da-c89943204f90","Type":"ContainerStarted","Data":"1ca36a9f366d3d9c9a6a96e1346139fe1b61752a74f5fae816a5efb2c47c1078"}
Oct 02 11:17:41 crc kubenswrapper[4929]: I1002 11:17:41.389723 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-77plf" event={"ID":"5a42d9c2-c5b4-400c-a4da-c89943204f90","Type":"ContainerStarted","Data":"b7ff2b393eb19f3246e83d537470e667538a6584bcb6ca008ec742d09daf18e6"}
Oct 02 11:17:41 crc kubenswrapper[4929]: I1002 11:17:41.389865 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:18:00 crc kubenswrapper[4929]: I1002 11:18:00.227239 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-77plf"
Oct 02 11:18:00 crc kubenswrapper[4929]: I1002 11:18:00.253751 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-77plf" podStartSLOduration=21.253718256 podStartE2EDuration="21.253718256s" podCreationTimestamp="2025-10-02 11:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:41.417769631 +0000 UTC m=+461.968136035" watchObservedRunningTime="2025-10-02 11:18:00.253718256 +0000 UTC m=+480.804084680"
Oct 02 11:18:00 crc kubenswrapper[4929]: I1002 11:18:00.287539 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r7lmd"]
Oct 02 11:18:25 crc kubenswrapper[4929]: I1002 11:18:25.334240 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" podUID="94158d82-3849-4716-a7a8-61b0c6236d1e" containerName="registry" containerID="cri-o://9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6" gracePeriod=30
Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.344757 4929 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440532 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-trusted-ca\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440578 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94158d82-3849-4716-a7a8-61b0c6236d1e-installation-pull-secrets\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440610 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-certificates\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440757 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440819 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-tls\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440851 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-bound-sa-token\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440888 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p5hr\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-kube-api-access-9p5hr\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.440908 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94158d82-3849-4716-a7a8-61b0c6236d1e-ca-trust-extracted\") pod \"94158d82-3849-4716-a7a8-61b0c6236d1e\" (UID: \"94158d82-3849-4716-a7a8-61b0c6236d1e\") " Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.441589 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.443118 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.447217 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94158d82-3849-4716-a7a8-61b0c6236d1e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.447257 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-kube-api-access-9p5hr" (OuterVolumeSpecName: "kube-api-access-9p5hr") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "kube-api-access-9p5hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.447285 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.447286 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.451888 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.456094 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94158d82-3849-4716-a7a8-61b0c6236d1e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "94158d82-3849-4716-a7a8-61b0c6236d1e" (UID: "94158d82-3849-4716-a7a8-61b0c6236d1e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.542186 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.542232 4929 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/94158d82-3849-4716-a7a8-61b0c6236d1e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.542244 4929 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.542254 4929 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.542265 4929 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.542273 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p5hr\" (UniqueName: \"kubernetes.io/projected/94158d82-3849-4716-a7a8-61b0c6236d1e-kube-api-access-9p5hr\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.542281 4929 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/94158d82-3849-4716-a7a8-61b0c6236d1e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.679469 4929 generic.go:334] "Generic (PLEG): container finished" podID="94158d82-3849-4716-a7a8-61b0c6236d1e" containerID="9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6" exitCode=0 Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.679512 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" event={"ID":"94158d82-3849-4716-a7a8-61b0c6236d1e","Type":"ContainerDied","Data":"9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6"} Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.679538 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd" event={"ID":"94158d82-3849-4716-a7a8-61b0c6236d1e","Type":"ContainerDied","Data":"7c2b236d0fcbbce94c99cd76a4a34d278b10cf04fc4532f90abcab48645b6ca1"} Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.679554 4929 scope.go:117] "RemoveContainer" containerID="9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6" Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.679655 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r7lmd"
Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.698111 4929 scope.go:117] "RemoveContainer" containerID="9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6"
Oct 02 11:18:26 crc kubenswrapper[4929]: E1002 11:18:26.699535 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6\": container with ID starting with 9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6 not found: ID does not exist" containerID="9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6"
Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.699571 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6"} err="failed to get container status \"9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6\": rpc error: code = NotFound desc = could not find container \"9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6\": container with ID starting with 9de9c36de8edab33fa85458b52ae5cc2c72b821fdddb8785cb3489d1f8f5c0d6 not found: ID does not exist"
Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.714601 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r7lmd"]
Oct 02 11:18:26 crc kubenswrapper[4929]: I1002 11:18:26.717974 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r7lmd"]
Oct 02 11:18:28 crc kubenswrapper[4929]: I1002 11:18:28.166753 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94158d82-3849-4716-a7a8-61b0c6236d1e" path="/var/lib/kubelet/pods/94158d82-3849-4716-a7a8-61b0c6236d1e/volumes"
Oct 02 11:19:44 crc kubenswrapper[4929]: I1002 11:19:44.736913 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:19:44 crc kubenswrapper[4929]: I1002 11:19:44.737619 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:20:14 crc kubenswrapper[4929]: I1002 11:20:14.737298 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:20:14 crc kubenswrapper[4929]: I1002 11:20:14.737819 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
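The E1002/I1002 pair above is the expected race after deleting a container: RemoveContainer succeeds, then a follow-up ContainerStatus call hits NotFound, and the kubelet treats that as "already gone" rather than a real failure. With the CRI client types from k8s.io/cri-api, the tolerant pattern is roughly the following compile-only sketch (client construction is assumed, not shown):

    package crisketch

    import (
        "context"
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    // statusOrGone queries the runtime for a container's status and treats a
    // gRPC NotFound as benign -- the container was already removed, which is
    // exactly the case logged above.
    func statusOrGone(ctx context.Context, rt runtimeapi.RuntimeServiceClient, id string) {
        resp, err := rt.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{ContainerId: id})
        if err != nil {
            if status.Code(err) == codes.NotFound {
                fmt.Println("already removed:", id)
                return
            }
            fmt.Println("runtime error:", err)
            return
        }
        fmt.Println("state:", resp.GetStatus().GetState())
    }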
Oct 02 11:20:44 crc kubenswrapper[4929]: I1002 11:20:44.736744 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:20:44 crc kubenswrapper[4929]: I1002 11:20:44.737192 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:20:44 crc kubenswrapper[4929]: I1002 11:20:44.737234 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 11:20:44 crc kubenswrapper[4929]: I1002 11:20:44.737679 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"734030d7b32aee89aaf1f696dd592d8b76828337b0add53e8b02a123d5ff922c"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 11:20:44 crc kubenswrapper[4929]: I1002 11:20:44.737722 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://734030d7b32aee89aaf1f696dd592d8b76828337b0add53e8b02a123d5ff922c" gracePeriod=600
Oct 02 11:20:45 crc kubenswrapper[4929]: I1002 11:20:45.454423 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="734030d7b32aee89aaf1f696dd592d8b76828337b0add53e8b02a123d5ff922c" exitCode=0
Oct 02 11:20:45 crc kubenswrapper[4929]: I1002 11:20:45.454488 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"734030d7b32aee89aaf1f696dd592d8b76828337b0add53e8b02a123d5ff922c"}
Oct 02 11:20:45 crc kubenswrapper[4929]: I1002 11:20:45.455097 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"f87898e72f32d780a00a4311f29a4b41ada294ade544d5a9ece8958a1d5f9fd0"}
Oct 02 11:20:45 crc kubenswrapper[4929]: I1002 11:20:45.455127 4929 scope.go:117] "RemoveContainer" containerID="b9925400fc03b2fde5e1ec0f965efa614c075283fd17290328742e2d1ea8b1ee"
Oct 02 11:22:34 crc kubenswrapper[4929]: I1002 11:22:34.923743 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wfcz"]
Oct 02 11:22:34 crc kubenswrapper[4929]: I1002 11:22:34.924544 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" podUID="923649eb-ddfc-4299-b94e-3f549a863233" containerName="controller-manager" containerID="cri-o://5db901c0d884a853b742f6c27c662acf331b4d85c88bc14e8c1301990d04efe8" gracePeriod=30
Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.042166 4929 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7"] Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.042604 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" podUID="c619b40e-e812-4689-aa5d-3ad89ec57afc" containerName="route-controller-manager" containerID="cri-o://655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1" gracePeriod=30 Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.132398 4929 generic.go:334] "Generic (PLEG): container finished" podID="923649eb-ddfc-4299-b94e-3f549a863233" containerID="5db901c0d884a853b742f6c27c662acf331b4d85c88bc14e8c1301990d04efe8" exitCode=0 Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.132455 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" event={"ID":"923649eb-ddfc-4299-b94e-3f549a863233","Type":"ContainerDied","Data":"5db901c0d884a853b742f6c27c662acf331b4d85c88bc14e8c1301990d04efe8"} Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.290313 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.381740 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.479529 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-config\") pod \"923649eb-ddfc-4299-b94e-3f549a863233\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.479580 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/923649eb-ddfc-4299-b94e-3f549a863233-serving-cert\") pod \"923649eb-ddfc-4299-b94e-3f549a863233\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.479613 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-client-ca\") pod \"923649eb-ddfc-4299-b94e-3f549a863233\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.479647 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles\") pod \"923649eb-ddfc-4299-b94e-3f549a863233\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.479684 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhnc2\" (UniqueName: \"kubernetes.io/projected/923649eb-ddfc-4299-b94e-3f549a863233-kube-api-access-zhnc2\") pod \"923649eb-ddfc-4299-b94e-3f549a863233\" (UID: \"923649eb-ddfc-4299-b94e-3f549a863233\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.480535 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"923649eb-ddfc-4299-b94e-3f549a863233" (UID: "923649eb-ddfc-4299-b94e-3f549a863233"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.480565 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-client-ca" (OuterVolumeSpecName: "client-ca") pod "923649eb-ddfc-4299-b94e-3f549a863233" (UID: "923649eb-ddfc-4299-b94e-3f549a863233"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.480872 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-config" (OuterVolumeSpecName: "config") pod "923649eb-ddfc-4299-b94e-3f549a863233" (UID: "923649eb-ddfc-4299-b94e-3f549a863233"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.484718 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923649eb-ddfc-4299-b94e-3f549a863233-kube-api-access-zhnc2" (OuterVolumeSpecName: "kube-api-access-zhnc2") pod "923649eb-ddfc-4299-b94e-3f549a863233" (UID: "923649eb-ddfc-4299-b94e-3f549a863233"). InnerVolumeSpecName "kube-api-access-zhnc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.485517 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/923649eb-ddfc-4299-b94e-3f549a863233-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "923649eb-ddfc-4299-b94e-3f549a863233" (UID: "923649eb-ddfc-4299-b94e-3f549a863233"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581146 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-config\") pod \"c619b40e-e812-4689-aa5d-3ad89ec57afc\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581232 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkg6d\" (UniqueName: \"kubernetes.io/projected/c619b40e-e812-4689-aa5d-3ad89ec57afc-kube-api-access-pkg6d\") pod \"c619b40e-e812-4689-aa5d-3ad89ec57afc\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581268 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c619b40e-e812-4689-aa5d-3ad89ec57afc-serving-cert\") pod \"c619b40e-e812-4689-aa5d-3ad89ec57afc\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581293 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-client-ca\") pod \"c619b40e-e812-4689-aa5d-3ad89ec57afc\" (UID: \"c619b40e-e812-4689-aa5d-3ad89ec57afc\") " Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581496 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581506 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/923649eb-ddfc-4299-b94e-3f549a863233-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581517 4929 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581526 4929 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/923649eb-ddfc-4299-b94e-3f549a863233-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.581534 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhnc2\" (UniqueName: \"kubernetes.io/projected/923649eb-ddfc-4299-b94e-3f549a863233-kube-api-access-zhnc2\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.582210 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-client-ca" (OuterVolumeSpecName: "client-ca") pod "c619b40e-e812-4689-aa5d-3ad89ec57afc" (UID: "c619b40e-e812-4689-aa5d-3ad89ec57afc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.582819 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-config" (OuterVolumeSpecName: "config") pod "c619b40e-e812-4689-aa5d-3ad89ec57afc" (UID: "c619b40e-e812-4689-aa5d-3ad89ec57afc"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.584902 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c619b40e-e812-4689-aa5d-3ad89ec57afc-kube-api-access-pkg6d" (OuterVolumeSpecName: "kube-api-access-pkg6d") pod "c619b40e-e812-4689-aa5d-3ad89ec57afc" (UID: "c619b40e-e812-4689-aa5d-3ad89ec57afc"). InnerVolumeSpecName "kube-api-access-pkg6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.586192 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c619b40e-e812-4689-aa5d-3ad89ec57afc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c619b40e-e812-4689-aa5d-3ad89ec57afc" (UID: "c619b40e-e812-4689-aa5d-3ad89ec57afc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.682919 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkg6d\" (UniqueName: \"kubernetes.io/projected/c619b40e-e812-4689-aa5d-3ad89ec57afc-kube-api-access-pkg6d\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.682976 4929 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c619b40e-e812-4689-aa5d-3ad89ec57afc-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.682988 4929 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:35 crc kubenswrapper[4929]: I1002 11:22:35.683016 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c619b40e-e812-4689-aa5d-3ad89ec57afc-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.138075 4929 generic.go:334] "Generic (PLEG): container finished" podID="c619b40e-e812-4689-aa5d-3ad89ec57afc" containerID="655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1" exitCode=0 Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.138184 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" event={"ID":"c619b40e-e812-4689-aa5d-3ad89ec57afc","Type":"ContainerDied","Data":"655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1"} Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.138251 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.138636 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7" event={"ID":"c619b40e-e812-4689-aa5d-3ad89ec57afc","Type":"ContainerDied","Data":"aa7ca4073e16e332a9351a2f6a06bc27dc8abbe32c130e16f3fd1d3d4bdfa926"} Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.138669 4929 scope.go:117] "RemoveContainer" containerID="655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.140094 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" event={"ID":"923649eb-ddfc-4299-b94e-3f549a863233","Type":"ContainerDied","Data":"ae8df8b7e7a1e7dacb97f1be2c880ce214b109a10505a76dca6de6c9a99df242"} Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.140190 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7wfcz" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.153420 4929 scope.go:117] "RemoveContainer" containerID="655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1" Oct 02 11:22:36 crc kubenswrapper[4929]: E1002 11:22:36.155498 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1\": container with ID starting with 655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1 not found: ID does not exist" containerID="655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.155532 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1"} err="failed to get container status \"655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1\": rpc error: code = NotFound desc = could not find container \"655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1\": container with ID starting with 655c1be1716bbe2a18cc3d40c7de032d8394775f8d79d1ea94e8f7e235a2bcd1 not found: ID does not exist" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.155554 4929 scope.go:117] "RemoveContainer" containerID="5db901c0d884a853b742f6c27c662acf331b4d85c88bc14e8c1301990d04efe8" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.170093 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7"] Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.172266 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l9jm7"] Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.187530 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wfcz"] Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.191809 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7wfcz"] Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.653908 4929 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-6b689b585d-5n24c"] Oct 02 11:22:36 crc kubenswrapper[4929]: E1002 11:22:36.654187 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923649eb-ddfc-4299-b94e-3f549a863233" containerName="controller-manager" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.654201 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="923649eb-ddfc-4299-b94e-3f549a863233" containerName="controller-manager" Oct 02 11:22:36 crc kubenswrapper[4929]: E1002 11:22:36.654213 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94158d82-3849-4716-a7a8-61b0c6236d1e" containerName="registry" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.654220 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="94158d82-3849-4716-a7a8-61b0c6236d1e" containerName="registry" Oct 02 11:22:36 crc kubenswrapper[4929]: E1002 11:22:36.654232 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c619b40e-e812-4689-aa5d-3ad89ec57afc" containerName="route-controller-manager" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.654238 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c619b40e-e812-4689-aa5d-3ad89ec57afc" containerName="route-controller-manager" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.654323 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="923649eb-ddfc-4299-b94e-3f549a863233" containerName="controller-manager" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.654336 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c619b40e-e812-4689-aa5d-3ad89ec57afc" containerName="route-controller-manager" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.654347 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="94158d82-3849-4716-a7a8-61b0c6236d1e" containerName="registry" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.654721 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.656413 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"] Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.660987 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.661236 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.661237 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.661414 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.661571 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.661655 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.662032 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.667523 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.667540 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.667650 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.667723 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.668354 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.668526 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.670797 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b689b585d-5n24c"] Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.676503 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.684234 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"] Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.696939 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-config\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697003 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj6j8\" (UniqueName: \"kubernetes.io/projected/adb74f0b-39d9-40c0-842f-b2b4fabfc830-kube-api-access-gj6j8\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697048 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-client-ca\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697082 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-proxy-ca-bundles\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " 
pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697112 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aee9868b-274a-424f-b726-d9b03541db08-serving-cert\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697179 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w29p6\" (UniqueName: \"kubernetes.io/projected/aee9868b-274a-424f-b726-d9b03541db08-kube-api-access-w29p6\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697217 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aee9868b-274a-424f-b726-d9b03541db08-client-ca\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697276 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adb74f0b-39d9-40c0-842f-b2b4fabfc830-serving-cert\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.697303 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee9868b-274a-424f-b726-d9b03541db08-config\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797599 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-proxy-ca-bundles\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797637 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aee9868b-274a-424f-b726-d9b03541db08-serving-cert\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797659 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w29p6\" (UniqueName: \"kubernetes.io/projected/aee9868b-274a-424f-b726-d9b03541db08-kube-api-access-w29p6\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " 
pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797678 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aee9868b-274a-424f-b726-d9b03541db08-client-ca\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797714 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adb74f0b-39d9-40c0-842f-b2b4fabfc830-serving-cert\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797732 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee9868b-274a-424f-b726-d9b03541db08-config\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797763 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-config\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797782 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj6j8\" (UniqueName: \"kubernetes.io/projected/adb74f0b-39d9-40c0-842f-b2b4fabfc830-kube-api-access-gj6j8\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.797798 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-client-ca\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.798779 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-client-ca\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.798844 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aee9868b-274a-424f-b726-d9b03541db08-client-ca\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.798987 4929 operation_generator.go:637] "MountVolume.SetUp 
Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.798995 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aee9868b-274a-424f-b726-d9b03541db08-config\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"
Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.799155 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adb74f0b-39d9-40c0-842f-b2b4fabfc830-config\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c"
Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.806222 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adb74f0b-39d9-40c0-842f-b2b4fabfc830-serving-cert\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c"
Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.807683 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aee9868b-274a-424f-b726-d9b03541db08-serving-cert\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"
Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.817715 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj6j8\" (UniqueName: \"kubernetes.io/projected/adb74f0b-39d9-40c0-842f-b2b4fabfc830-kube-api-access-gj6j8\") pod \"controller-manager-6b689b585d-5n24c\" (UID: \"adb74f0b-39d9-40c0-842f-b2b4fabfc830\") " pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c"
Oct 02 11:22:36 crc kubenswrapper[4929]: I1002 11:22:36.824633 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w29p6\" (UniqueName: \"kubernetes.io/projected/aee9868b-274a-424f-b726-d9b03541db08-kube-api-access-w29p6\") pod \"route-controller-manager-668f646f8c-fc7b4\" (UID: \"aee9868b-274a-424f-b726-d9b03541db08\") " pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"
Oct 02 11:22:37 crc kubenswrapper[4929]: I1002 11:22:37.002497 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c"
Oct 02 11:22:37 crc kubenswrapper[4929]: I1002 11:22:37.011189 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"
Oct 02 11:22:37 crc kubenswrapper[4929]: I1002 11:22:37.217679 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b689b585d-5n24c"]
Oct 02 11:22:37 crc kubenswrapper[4929]: I1002 11:22:37.305250 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"]
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.163926 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923649eb-ddfc-4299-b94e-3f549a863233" path="/var/lib/kubelet/pods/923649eb-ddfc-4299-b94e-3f549a863233/volumes"
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.164913 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c619b40e-e812-4689-aa5d-3ad89ec57afc" path="/var/lib/kubelet/pods/c619b40e-e812-4689-aa5d-3ad89ec57afc/volumes"
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.165699 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" event={"ID":"aee9868b-274a-424f-b726-d9b03541db08","Type":"ContainerStarted","Data":"ddbf46a75aa3dc17f5921ddf57132dee2add4f71d74008b2c9f72537cb644b2f"}
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.165740 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.165754 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" event={"ID":"aee9868b-274a-424f-b726-d9b03541db08","Type":"ContainerStarted","Data":"ec205f6987bee4d036f461099740c4d7323c11b1deabcd300467f856a1af559f"}
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.166690 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" event={"ID":"adb74f0b-39d9-40c0-842f-b2b4fabfc830","Type":"ContainerStarted","Data":"2eeebf17c06677bee8971bf88e1efe052d854f51843db11749f53a832d0fecfe"}
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.166740 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" event={"ID":"adb74f0b-39d9-40c0-842f-b2b4fabfc830","Type":"ContainerStarted","Data":"a0bc2b0336f0d3ead597420c3457628c67b64bd97812850feb47bbe68a5ebdfa"}
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.167077 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c"
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.170900 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4"
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.172792 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c"
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.187698 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-668f646f8c-fc7b4" podStartSLOduration=3.187678277 podStartE2EDuration="3.187678277s" podCreationTimestamp="2025-10-02 11:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:22:38.185470973 +0000 UTC m=+758.735837347" watchObservedRunningTime="2025-10-02 11:22:38.187678277 +0000 UTC m=+758.738044641"
Oct 02 11:22:38 crc kubenswrapper[4929]: I1002 11:22:38.204541 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b689b585d-5n24c" podStartSLOduration=3.204519979 podStartE2EDuration="3.204519979s" podCreationTimestamp="2025-10-02 11:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:22:38.201565703 +0000 UTC m=+758.751932067" watchObservedRunningTime="2025-10-02 11:22:38.204519979 +0000 UTC m=+758.754886343"
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.844782 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5fzl7"]
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.845666 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="nbdb" containerID="cri-o://fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537" gracePeriod=30
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.845784 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="sbdb" containerID="cri-o://b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d" gracePeriod=30
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.845845 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656" gracePeriod=30
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.845879 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="northd" containerID="cri-o://2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a" gracePeriod=30
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.845926 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-node" containerID="cri-o://b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756" gracePeriod=30
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.845636 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-controller" containerID="cri-o://f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367" gracePeriod=30
Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.846003 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-acl-logging" containerID="cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b" gracePeriod=30
containerID="cri-o://000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b" gracePeriod=30 Oct 02 11:22:41 crc kubenswrapper[4929]: I1002 11:22:41.938461 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" containerID="cri-o://2f40b19688c14633cea5dde4e9ee1cf384439a3fabd89a65b3a3b3da215f6d97" gracePeriod=30 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.189309 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovnkube-controller/3.log" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.194364 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovn-acl-logging/0.log" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.195265 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovn-controller/0.log" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196028 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="2f40b19688c14633cea5dde4e9ee1cf384439a3fabd89a65b3a3b3da215f6d97" exitCode=0 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196064 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d" exitCode=0 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196073 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537" exitCode=0 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196082 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a" exitCode=0 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196091 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656" exitCode=0 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196080 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"2f40b19688c14633cea5dde4e9ee1cf384439a3fabd89a65b3a3b3da215f6d97"} Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196101 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756" exitCode=0 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196111 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b" exitCode=143 Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196119 4929 generic.go:334] "Generic (PLEG): container finished" podID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerID="f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367" exitCode=143 Oct 02 11:22:42 
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196172 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537"}
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196184 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a"}
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196187 4929 scope.go:117] "RemoveContainer" containerID="5ef94762d3b46fb78f1a52f1e7b317762632a377d99e67ca9cdf7774c63f7fee"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196199 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656"}
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196232 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756"}
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196244 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b"}
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.196254 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367"}
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.198712 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/2.log"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.199257 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/1.log"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.199298 4929 generic.go:334] "Generic (PLEG): container finished" podID="4599e863-12c0-4c39-a873-a46012459555" containerID="0f80d407099e4b88f75718fb6019a8896ef25529261ce8f5fc038aacb4dd4c0e" exitCode=2
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.199324 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerDied","Data":"0f80d407099e4b88f75718fb6019a8896ef25529261ce8f5fc038aacb4dd4c0e"}
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.199848 4929 scope.go:117] "RemoveContainer" containerID="0f80d407099e4b88f75718fb6019a8896ef25529261ce8f5fc038aacb4dd4c0e"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.284255 4929 scope.go:117] "RemoveContainer" containerID="d9f1e589e2668000ed35fe74af024739e9c3c65ab78ff0a22953f56a619a7f50"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.637021 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovn-acl-logging/0.log"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.637737 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovn-controller/0.log"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.638184 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699125 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dpch6"]
Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699357 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-controller"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699372 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-controller"
Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699382 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699389 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller"
Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699396 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699406 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller"
Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699419 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kubecfg-setup"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699427 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kubecfg-setup"
Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699436 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="northd"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699443 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="northd"
Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699455 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-acl-logging"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699462 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-acl-logging"
Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699471 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller"
containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699479 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699488 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="nbdb" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699495 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="nbdb" Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699506 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699513 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699524 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="sbdb" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699533 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="sbdb" Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699547 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-node" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699555 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-node" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699682 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699696 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699710 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovn-acl-logging" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699721 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699729 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699738 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="northd" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699747 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="nbdb" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699755 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699764 4929 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="sbdb" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699772 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="kube-rbac-proxy-node" Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699879 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699889 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: E1002 11:22:42.699898 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.699904 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.700038 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.700053 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" containerName="ovnkube-controller" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.702176 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780498 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-etc-openvswitch\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780560 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-slash\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780589 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-env-overrides\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780601 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780611 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkltr\" (UniqueName: \"kubernetes.io/projected/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-kube-api-access-pkltr\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780636 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-netns\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780639 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-slash" (OuterVolumeSpecName: "host-slash") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780663 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-kubelet\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780717 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-ovn-kubernetes\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780775 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780742 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-bin\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780793 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780822 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). 
InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780870 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-node-log\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780873 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780911 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780937 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780973 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-node-log" (OuterVolumeSpecName: "node-log") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.780978 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-ovn\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781010 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781013 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781088 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-log-socket\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781120 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-var-lib-openvswitch\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781148 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-systemd\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781153 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-log-socket" (OuterVolumeSpecName: "log-socket") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781166 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-openvswitch\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781181 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781194 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-systemd-units\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781227 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovn-node-metrics-cert\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") " Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781193 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "run-openvswitch". 
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781211 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781257 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-script-lib\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") "
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781337 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-netd\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") "
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781368 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-config\") pod \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\" (UID: \"5862ad0e-b703-4706-a7b4-25e4fdf5f78e\") "
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781443 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781612 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781858 4929 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-netd\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781878 4929 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781886 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781891 4929 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-slash\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781923 4929 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781935 4929 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-netns\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781946 4929 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-kubelet\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781975 4929 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781988 4929 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-cni-bin\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.781998 4929 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-node-log\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.782010 4929 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.782021 4929 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.782033 4929 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-log-socket\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.782044 4929 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.782055 4929 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.782065 4929 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-systemd-units\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.782076 4929 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.786246 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-kube-api-access-pkltr" (OuterVolumeSpecName: "kube-api-access-pkltr") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "kube-api-access-pkltr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.786649 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.793536 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5862ad0e-b703-4706-a7b4-25e4fdf5f78e" (UID: "5862ad0e-b703-4706-a7b4-25e4fdf5f78e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882702 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-kubelet\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882741 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-node-log\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882758 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-ovnkube-config\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882778 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-slash\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6"
Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882798 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-systemd\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6"
\"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882860 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882906 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-cni-bin\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-ovn\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.882994 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-run-netns\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883012 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-run-ovn-kubernetes\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883034 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-var-lib-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883079 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-systemd-units\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883098 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-log-socket\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883185 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-cni-netd\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883229 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-etc-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883256 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-env-overrides\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883315 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-ovnkube-script-lib\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883339 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/453df8e9-ff66-46c1-9aff-a11dff132acb-ovn-node-metrics-cert\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883367 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4hh\" (UniqueName: \"kubernetes.io/projected/453df8e9-ff66-46c1-9aff-a11dff132acb-kube-api-access-4l4hh\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883433 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883446 4929 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.883457 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkltr\" (UniqueName: \"kubernetes.io/projected/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-kube-api-access-pkltr\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:42 crc kubenswrapper[4929]: 
I1002 11:22:42.883469 4929 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5862ad0e-b703-4706-a7b4-25e4fdf5f78e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.984829 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-cni-netd\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985115 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-cni-netd\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985162 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-etc-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985121 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-etc-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985210 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-env-overrides\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985243 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-ovnkube-script-lib\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985267 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/453df8e9-ff66-46c1-9aff-a11dff132acb-ovn-node-metrics-cert\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985287 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4hh\" (UniqueName: \"kubernetes.io/projected/453df8e9-ff66-46c1-9aff-a11dff132acb-kube-api-access-4l4hh\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985310 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-ovnkube-config\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985325 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-kubelet\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985339 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-node-log\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985362 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-slash\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985380 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-systemd\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985402 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985420 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-cni-bin\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985440 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-ovn\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985458 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-run-netns\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985471 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-run-ovn-kubernetes\") pod \"ovnkube-node-dpch6\" (UID: 
\"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985486 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985506 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-var-lib-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985524 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-systemd-units\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985541 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-log-socket\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985586 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-log-socket\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985771 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-env-overrides\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.985811 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986202 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-ovnkube-script-lib\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986240 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-cni-bin\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986263 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-ovn\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986283 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-run-netns\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986302 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-run-ovn-kubernetes\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986325 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986345 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-var-lib-openvswitch\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986366 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-systemd-units\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986393 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-node-log\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986626 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-kubelet\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986677 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-host-slash\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986691 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/453df8e9-ff66-46c1-9aff-a11dff132acb-run-systemd\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.986785 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/453df8e9-ff66-46c1-9aff-a11dff132acb-ovnkube-config\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:42 crc kubenswrapper[4929]: I1002 11:22:42.992693 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/453df8e9-ff66-46c1-9aff-a11dff132acb-ovn-node-metrics-cert\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.007850 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4hh\" (UniqueName: \"kubernetes.io/projected/453df8e9-ff66-46c1-9aff-a11dff132acb-kube-api-access-4l4hh\") pod \"ovnkube-node-dpch6\" (UID: \"453df8e9-ff66-46c1-9aff-a11dff132acb\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.015847 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:43 crc kubenswrapper[4929]: W1002 11:22:43.037268 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453df8e9_ff66_46c1_9aff_a11dff132acb.slice/crio-dacea08e402b223c76613ae6f822c9ca16038cd00cbf7e739f2e6ac1fadbded0 WatchSource:0}: Error finding container dacea08e402b223c76613ae6f822c9ca16038cd00cbf7e739f2e6ac1fadbded0: Status 404 returned error can't find the container with id dacea08e402b223c76613ae6f822c9ca16038cd00cbf7e739f2e6ac1fadbded0 Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.206419 4929 generic.go:334] "Generic (PLEG): container finished" podID="453df8e9-ff66-46c1-9aff-a11dff132acb" containerID="99fd6c82a1bb0a051a39f88b402401f59c4072ef1719cfe9a04b3129e4bae319" exitCode=0 Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.206507 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerDied","Data":"99fd6c82a1bb0a051a39f88b402401f59c4072ef1719cfe9a04b3129e4bae319"} Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.206559 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"dacea08e402b223c76613ae6f822c9ca16038cd00cbf7e739f2e6ac1fadbded0"} Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.212062 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovn-acl-logging/0.log" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.212600 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fzl7_5862ad0e-b703-4706-a7b4-25e4fdf5f78e/ovn-controller/0.log" Oct 02 11:22:43 
crc kubenswrapper[4929]: I1002 11:22:43.212999 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" event={"ID":"5862ad0e-b703-4706-a7b4-25e4fdf5f78e","Type":"ContainerDied","Data":"4cd6b6c8789c6571117bbaf188272d65f466116dd57d1acea20cfebca7c30f33"} Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.213034 4929 scope.go:117] "RemoveContainer" containerID="2f40b19688c14633cea5dde4e9ee1cf384439a3fabd89a65b3a3b3da215f6d97" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.213071 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fzl7" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.217008 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gbz4b_4599e863-12c0-4c39-a873-a46012459555/kube-multus/2.log" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.217113 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gbz4b" event={"ID":"4599e863-12c0-4c39-a873-a46012459555","Type":"ContainerStarted","Data":"6ae68413b92037ba1ac3a08bc6f3bc0eb1430c845eb0c352a9310fa52efbd049"} Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.266624 4929 scope.go:117] "RemoveContainer" containerID="b009969e3ff6a954795e06deb52cdc20c2e5603263c0eb7309418235b2438e9d" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.283268 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5fzl7"] Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.285671 4929 scope.go:117] "RemoveContainer" containerID="fe31e192363656294d7e1519af542db55fb9b8edf649bfe2f06b8b9e746d0537" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.287432 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5fzl7"] Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.315271 4929 scope.go:117] "RemoveContainer" containerID="2f25fde8b377f2de73b7ed8b785512fc2d66c5ccf3ae0e0b8a568def458d4d6a" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.333419 4929 scope.go:117] "RemoveContainer" containerID="397f37e255562f8f68f9d76e1fd761645d147a9bd2fd784f619732339fb50656" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.349980 4929 scope.go:117] "RemoveContainer" containerID="b48a14888ce8141cee720fde54ce779d123f82637a49253b138b3900027af756" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.362950 4929 scope.go:117] "RemoveContainer" containerID="000230ceb00c439376d1cd1c0db04773f5aee296e5338d762aab0fa27087371b" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.378925 4929 scope.go:117] "RemoveContainer" containerID="f38073a9e6dd61cef1eb7858d734ca2426fc19702263ed80ffd293384d469367" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.406149 4929 scope.go:117] "RemoveContainer" containerID="8a8c0956a4e83e81d9ea2c7a1dca44c36249517cff037496fcd5e5ebfcb36054" Oct 02 11:22:43 crc kubenswrapper[4929]: I1002 11:22:43.752166 4929 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 11:22:44 crc kubenswrapper[4929]: I1002 11:22:44.162871 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5862ad0e-b703-4706-a7b4-25e4fdf5f78e" path="/var/lib/kubelet/pods/5862ad0e-b703-4706-a7b4-25e4fdf5f78e/volumes" Oct 02 11:22:44 crc kubenswrapper[4929]: I1002 11:22:44.223344 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"3c42b75470e9d6e215709b5f7bace3ef6f842e52108d575932db49216e1e0469"} Oct 02 11:22:44 crc kubenswrapper[4929]: I1002 11:22:44.223384 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"773b01f8cc2e8198ffe081c06b0349edd0a3c6a51d7cf0ed827af0f19af4fe06"} Oct 02 11:22:44 crc kubenswrapper[4929]: I1002 11:22:44.223395 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"c7aed7ed4092c438ddb2074a1a950ca7e73da086bad97fbc6a40c01389c283c0"} Oct 02 11:22:44 crc kubenswrapper[4929]: I1002 11:22:44.223404 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"ae65435e72d1747030f7395d439890b7dd938ea327d22096823d426569f3dadb"} Oct 02 11:22:44 crc kubenswrapper[4929]: I1002 11:22:44.223413 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"8feb18f59d2173fba0ccdb94866966f2a7be3b01c3d6814bc84186321c40e6f9"} Oct 02 11:22:44 crc kubenswrapper[4929]: I1002 11:22:44.223421 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"88ade3c9566963691c6f2a54b25bdf9be4097ddab727b403e717bdf8ec0cbd9c"} Oct 02 11:22:46 crc kubenswrapper[4929]: I1002 11:22:46.247708 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"c081a25582f4b9befaf6ec6792086ee312338a95dcb56ae6c1348f9599a234ae"} Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.583063 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-2gmbm"] Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.584679 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.586458 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.586503 4929 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-2cfdb" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.586896 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.586939 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.742499 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/816c4698-4b96-4335-9661-ce6f6031fb6c-node-mnt\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.742586 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94fvw\" (UniqueName: \"kubernetes.io/projected/816c4698-4b96-4335-9661-ce6f6031fb6c-kube-api-access-94fvw\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.742644 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/816c4698-4b96-4335-9661-ce6f6031fb6c-crc-storage\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.844264 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/816c4698-4b96-4335-9661-ce6f6031fb6c-node-mnt\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.844348 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94fvw\" (UniqueName: \"kubernetes.io/projected/816c4698-4b96-4335-9661-ce6f6031fb6c-kube-api-access-94fvw\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.844401 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/816c4698-4b96-4335-9661-ce6f6031fb6c-crc-storage\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.844725 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/816c4698-4b96-4335-9661-ce6f6031fb6c-node-mnt\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.845377 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/816c4698-4b96-4335-9661-ce6f6031fb6c-crc-storage\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.863746 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94fvw\" (UniqueName: \"kubernetes.io/projected/816c4698-4b96-4335-9661-ce6f6031fb6c-kube-api-access-94fvw\") pod \"crc-storage-crc-2gmbm\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: I1002 11:22:47.901373 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: E1002 11:22:47.935263 4929 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(b90671588a90db06096b653102830d596d6f5c638e77a99b7650fad57876b479): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:22:47 crc kubenswrapper[4929]: E1002 11:22:47.935419 4929 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(b90671588a90db06096b653102830d596d6f5c638e77a99b7650fad57876b479): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: E1002 11:22:47.935469 4929 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(b90671588a90db06096b653102830d596d6f5c638e77a99b7650fad57876b479): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:47 crc kubenswrapper[4929]: E1002 11:22:47.935750 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-2gmbm_crc-storage(816c4698-4b96-4335-9661-ce6f6031fb6c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-2gmbm_crc-storage(816c4698-4b96-4335-9661-ce6f6031fb6c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(b90671588a90db06096b653102830d596d6f5c638e77a99b7650fad57876b479): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-2gmbm" podUID="816c4698-4b96-4335-9661-ce6f6031fb6c" Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.283020 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" event={"ID":"453df8e9-ff66-46c1-9aff-a11dff132acb","Type":"ContainerStarted","Data":"cba843c67cb1048a2325d50977894b5d9cfc6cd5f6fda899e945b60b58ca18f1"} Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.283429 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.283448 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.308684 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" podStartSLOduration=10.308663519 podStartE2EDuration="10.308663519s" podCreationTimestamp="2025-10-02 11:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:22:52.306157616 +0000 UTC m=+772.856523980" watchObservedRunningTime="2025-10-02 11:22:52.308663519 +0000 UTC m=+772.859029883" Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.313875 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.454379 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2gmbm"] Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.454508 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:52 crc kubenswrapper[4929]: I1002 11:22:52.454884 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:52 crc kubenswrapper[4929]: E1002 11:22:52.476459 4929 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(e3b07d4a8f3798f07454c3d82bae5e54e1cab14a889dccb05a8a4a3f6ec4a1cb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 11:22:52 crc kubenswrapper[4929]: E1002 11:22:52.476519 4929 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(e3b07d4a8f3798f07454c3d82bae5e54e1cab14a889dccb05a8a4a3f6ec4a1cb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:52 crc kubenswrapper[4929]: E1002 11:22:52.476542 4929 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(e3b07d4a8f3798f07454c3d82bae5e54e1cab14a889dccb05a8a4a3f6ec4a1cb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:22:52 crc kubenswrapper[4929]: E1002 11:22:52.476606 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-2gmbm_crc-storage(816c4698-4b96-4335-9661-ce6f6031fb6c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-2gmbm_crc-storage(816c4698-4b96-4335-9661-ce6f6031fb6c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-2gmbm_crc-storage_816c4698-4b96-4335-9661-ce6f6031fb6c_0(e3b07d4a8f3798f07454c3d82bae5e54e1cab14a889dccb05a8a4a3f6ec4a1cb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-2gmbm" podUID="816c4698-4b96-4335-9661-ce6f6031fb6c" Oct 02 11:22:53 crc kubenswrapper[4929]: I1002 11:22:53.016548 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:22:53 crc kubenswrapper[4929]: I1002 11:22:53.049419 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:23:07 crc kubenswrapper[4929]: I1002 11:23:07.156525 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:23:07 crc kubenswrapper[4929]: I1002 11:23:07.157514 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:23:07 crc kubenswrapper[4929]: I1002 11:23:07.541871 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-2gmbm"] Oct 02 11:23:07 crc kubenswrapper[4929]: W1002 11:23:07.546722 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod816c4698_4b96_4335_9661_ce6f6031fb6c.slice/crio-70dbe6d3b0772cb8f224d68a8ef6a3a98c8daabe12d875cc8bc400e4a46f941a WatchSource:0}: Error finding container 70dbe6d3b0772cb8f224d68a8ef6a3a98c8daabe12d875cc8bc400e4a46f941a: Status 404 returned error can't find the container with id 70dbe6d3b0772cb8f224d68a8ef6a3a98c8daabe12d875cc8bc400e4a46f941a Oct 02 11:23:07 crc kubenswrapper[4929]: I1002 11:23:07.549211 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:23:08 crc kubenswrapper[4929]: I1002 11:23:08.394352 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2gmbm" event={"ID":"816c4698-4b96-4335-9661-ce6f6031fb6c","Type":"ContainerStarted","Data":"70dbe6d3b0772cb8f224d68a8ef6a3a98c8daabe12d875cc8bc400e4a46f941a"} Oct 02 11:23:11 crc kubenswrapper[4929]: I1002 11:23:11.409595 4929 generic.go:334] "Generic (PLEG): container finished" podID="816c4698-4b96-4335-9661-ce6f6031fb6c" containerID="f8851a997bc198fa21e046849aa89a3c55baaaad2903035da1f9eebc403707ec" exitCode=0 Oct 02 11:23:11 crc kubenswrapper[4929]: I1002 11:23:11.409684 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2gmbm" event={"ID":"816c4698-4b96-4335-9661-ce6f6031fb6c","Type":"ContainerDied","Data":"f8851a997bc198fa21e046849aa89a3c55baaaad2903035da1f9eebc403707ec"} Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.701504 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.718925 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94fvw\" (UniqueName: \"kubernetes.io/projected/816c4698-4b96-4335-9661-ce6f6031fb6c-kube-api-access-94fvw\") pod \"816c4698-4b96-4335-9661-ce6f6031fb6c\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.719055 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/816c4698-4b96-4335-9661-ce6f6031fb6c-crc-storage\") pod \"816c4698-4b96-4335-9661-ce6f6031fb6c\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.719078 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/816c4698-4b96-4335-9661-ce6f6031fb6c-node-mnt\") pod \"816c4698-4b96-4335-9661-ce6f6031fb6c\" (UID: \"816c4698-4b96-4335-9661-ce6f6031fb6c\") " Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.719478 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/816c4698-4b96-4335-9661-ce6f6031fb6c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "816c4698-4b96-4335-9661-ce6f6031fb6c" (UID: "816c4698-4b96-4335-9661-ce6f6031fb6c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.726694 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/816c4698-4b96-4335-9661-ce6f6031fb6c-kube-api-access-94fvw" (OuterVolumeSpecName: "kube-api-access-94fvw") pod "816c4698-4b96-4335-9661-ce6f6031fb6c" (UID: "816c4698-4b96-4335-9661-ce6f6031fb6c"). InnerVolumeSpecName "kube-api-access-94fvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.737878 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/816c4698-4b96-4335-9661-ce6f6031fb6c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "816c4698-4b96-4335-9661-ce6f6031fb6c" (UID: "816c4698-4b96-4335-9661-ce6f6031fb6c"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.820712 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94fvw\" (UniqueName: \"kubernetes.io/projected/816c4698-4b96-4335-9661-ce6f6031fb6c-kube-api-access-94fvw\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.820771 4929 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/816c4698-4b96-4335-9661-ce6f6031fb6c-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:12 crc kubenswrapper[4929]: I1002 11:23:12.820782 4929 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/816c4698-4b96-4335-9661-ce6f6031fb6c-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:13 crc kubenswrapper[4929]: I1002 11:23:13.053365 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dpch6" Oct 02 11:23:13 crc kubenswrapper[4929]: I1002 11:23:13.423250 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-2gmbm" event={"ID":"816c4698-4b96-4335-9661-ce6f6031fb6c","Type":"ContainerDied","Data":"70dbe6d3b0772cb8f224d68a8ef6a3a98c8daabe12d875cc8bc400e4a46f941a"} Oct 02 11:23:13 crc kubenswrapper[4929]: I1002 11:23:13.423287 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70dbe6d3b0772cb8f224d68a8ef6a3a98c8daabe12d875cc8bc400e4a46f941a" Oct 02 11:23:13 crc kubenswrapper[4929]: I1002 11:23:13.423303 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-2gmbm" Oct 02 11:23:14 crc kubenswrapper[4929]: I1002 11:23:14.736690 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:23:14 crc kubenswrapper[4929]: I1002 11:23:14.736772 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.580685 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v"] Oct 02 11:23:19 crc kubenswrapper[4929]: E1002 11:23:19.581393 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816c4698-4b96-4335-9661-ce6f6031fb6c" containerName="storage" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.581406 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="816c4698-4b96-4335-9661-ce6f6031fb6c" containerName="storage" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.581520 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="816c4698-4b96-4335-9661-ce6f6031fb6c" containerName="storage" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.582287 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.583953 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.592456 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v"] Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.605172 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb9p9\" (UniqueName: \"kubernetes.io/projected/0986f61a-c4cc-426b-861c-343816225e99-kube-api-access-fb9p9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.605225 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.605246 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.706990 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb9p9\" (UniqueName: \"kubernetes.io/projected/0986f61a-c4cc-426b-861c-343816225e99-kube-api-access-fb9p9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.707083 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.707111 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.708021 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.708174 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.727495 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb9p9\" (UniqueName: \"kubernetes.io/projected/0986f61a-c4cc-426b-861c-343816225e99-kube-api-access-fb9p9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:19 crc kubenswrapper[4929]: I1002 11:23:19.897052 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:20 crc kubenswrapper[4929]: I1002 11:23:20.288672 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v"] Oct 02 11:23:20 crc kubenswrapper[4929]: I1002 11:23:20.460033 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" event={"ID":"0986f61a-c4cc-426b-861c-343816225e99","Type":"ContainerStarted","Data":"3e070924432d55d4a45b162a570f97e107013627b8e3c565c78a434d82e57532"} Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.468538 4929 generic.go:334] "Generic (PLEG): container finished" podID="0986f61a-c4cc-426b-861c-343816225e99" containerID="a29dfa5eac0b7f779b227c53bcbc5d9d89e6205f1dc9d23031ac4fda795629da" exitCode=0 Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.468587 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" event={"ID":"0986f61a-c4cc-426b-861c-343816225e99","Type":"ContainerDied","Data":"a29dfa5eac0b7f779b227c53bcbc5d9d89e6205f1dc9d23031ac4fda795629da"} Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.868874 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6dght"] Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.870922 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.879585 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dght"] Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.934197 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-utilities\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.934247 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-catalog-content\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:21 crc kubenswrapper[4929]: I1002 11:23:21.934326 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvpdk\" (UniqueName: \"kubernetes.io/projected/667150be-d3bc-4026-baf3-f2998d02fa9c-kube-api-access-gvpdk\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.037666 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-utilities\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.037723 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-catalog-content\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.037807 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvpdk\" (UniqueName: \"kubernetes.io/projected/667150be-d3bc-4026-baf3-f2998d02fa9c-kube-api-access-gvpdk\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.038487 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-utilities\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.038652 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-catalog-content\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.057899 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gvpdk\" (UniqueName: \"kubernetes.io/projected/667150be-d3bc-4026-baf3-f2998d02fa9c-kube-api-access-gvpdk\") pod \"redhat-operators-6dght\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.197594 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.402386 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6dght"] Oct 02 11:23:22 crc kubenswrapper[4929]: I1002 11:23:22.474935 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dght" event={"ID":"667150be-d3bc-4026-baf3-f2998d02fa9c","Type":"ContainerStarted","Data":"c0a97d42a7cb36e84e037a02d8446d486baace1f6d6fad3ce83bf106828016aa"} Oct 02 11:23:23 crc kubenswrapper[4929]: I1002 11:23:23.481195 4929 generic.go:334] "Generic (PLEG): container finished" podID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerID="a34db5d9b21b7867dade443deb88b4f79edec3f6d7700a3814e8514c89790ebd" exitCode=0 Oct 02 11:23:23 crc kubenswrapper[4929]: I1002 11:23:23.481363 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dght" event={"ID":"667150be-d3bc-4026-baf3-f2998d02fa9c","Type":"ContainerDied","Data":"a34db5d9b21b7867dade443deb88b4f79edec3f6d7700a3814e8514c89790ebd"} Oct 02 11:23:24 crc kubenswrapper[4929]: I1002 11:23:24.488256 4929 generic.go:334] "Generic (PLEG): container finished" podID="0986f61a-c4cc-426b-861c-343816225e99" containerID="358816758a05f5a0571ebd65bec1b3a405f4a10aa1f325b42ed8fea28d8d3514" exitCode=0 Oct 02 11:23:24 crc kubenswrapper[4929]: I1002 11:23:24.488304 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" event={"ID":"0986f61a-c4cc-426b-861c-343816225e99","Type":"ContainerDied","Data":"358816758a05f5a0571ebd65bec1b3a405f4a10aa1f325b42ed8fea28d8d3514"} Oct 02 11:23:25 crc kubenswrapper[4929]: I1002 11:23:25.495314 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dght" event={"ID":"667150be-d3bc-4026-baf3-f2998d02fa9c","Type":"ContainerStarted","Data":"55ee2ac82dbd6a51fbb92f6d22bb5e033da0a02ee4a41c2257a2cc582d373c81"} Oct 02 11:23:25 crc kubenswrapper[4929]: I1002 11:23:25.497740 4929 generic.go:334] "Generic (PLEG): container finished" podID="0986f61a-c4cc-426b-861c-343816225e99" containerID="9e11d24a86f0593ab64343064055f8c6e8ac2ceb6317cac9dc60307ae907e921" exitCode=0 Oct 02 11:23:25 crc kubenswrapper[4929]: I1002 11:23:25.497767 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" event={"ID":"0986f61a-c4cc-426b-861c-343816225e99","Type":"ContainerDied","Data":"9e11d24a86f0593ab64343064055f8c6e8ac2ceb6317cac9dc60307ae907e921"} Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.506732 4929 generic.go:334] "Generic (PLEG): container finished" podID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerID="55ee2ac82dbd6a51fbb92f6d22bb5e033da0a02ee4a41c2257a2cc582d373c81" exitCode=0 Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.507091 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dght" 
event={"ID":"667150be-d3bc-4026-baf3-f2998d02fa9c","Type":"ContainerDied","Data":"55ee2ac82dbd6a51fbb92f6d22bb5e033da0a02ee4a41c2257a2cc582d373c81"} Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.729731 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.792713 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-bundle\") pod \"0986f61a-c4cc-426b-861c-343816225e99\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.792937 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-util\") pod \"0986f61a-c4cc-426b-861c-343816225e99\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.793024 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb9p9\" (UniqueName: \"kubernetes.io/projected/0986f61a-c4cc-426b-861c-343816225e99-kube-api-access-fb9p9\") pod \"0986f61a-c4cc-426b-861c-343816225e99\" (UID: \"0986f61a-c4cc-426b-861c-343816225e99\") " Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.799476 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-bundle" (OuterVolumeSpecName: "bundle") pod "0986f61a-c4cc-426b-861c-343816225e99" (UID: "0986f61a-c4cc-426b-861c-343816225e99"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.801564 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0986f61a-c4cc-426b-861c-343816225e99-kube-api-access-fb9p9" (OuterVolumeSpecName: "kube-api-access-fb9p9") pod "0986f61a-c4cc-426b-861c-343816225e99" (UID: "0986f61a-c4cc-426b-861c-343816225e99"). InnerVolumeSpecName "kube-api-access-fb9p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.894767 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb9p9\" (UniqueName: \"kubernetes.io/projected/0986f61a-c4cc-426b-861c-343816225e99-kube-api-access-fb9p9\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.894789 4929 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.913939 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-util" (OuterVolumeSpecName: "util") pod "0986f61a-c4cc-426b-861c-343816225e99" (UID: "0986f61a-c4cc-426b-861c-343816225e99"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:26 crc kubenswrapper[4929]: I1002 11:23:26.995827 4929 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0986f61a-c4cc-426b-861c-343816225e99-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:27 crc kubenswrapper[4929]: I1002 11:23:27.513508 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" event={"ID":"0986f61a-c4cc-426b-861c-343816225e99","Type":"ContainerDied","Data":"3e070924432d55d4a45b162a570f97e107013627b8e3c565c78a434d82e57532"} Oct 02 11:23:27 crc kubenswrapper[4929]: I1002 11:23:27.513821 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e070924432d55d4a45b162a570f97e107013627b8e3c565c78a434d82e57532" Oct 02 11:23:27 crc kubenswrapper[4929]: I1002 11:23:27.513542 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v" Oct 02 11:23:27 crc kubenswrapper[4929]: I1002 11:23:27.516036 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dght" event={"ID":"667150be-d3bc-4026-baf3-f2998d02fa9c","Type":"ContainerStarted","Data":"c12046180c61e148f59812ac004c8f8dd526bd9a81c72d3e457008882303cf06"} Oct 02 11:23:27 crc kubenswrapper[4929]: I1002 11:23:27.536600 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6dght" podStartSLOduration=2.921286682 podStartE2EDuration="6.536575429s" podCreationTimestamp="2025-10-02 11:23:21 +0000 UTC" firstStartedPulling="2025-10-02 11:23:23.482873896 +0000 UTC m=+804.033240260" lastFinishedPulling="2025-10-02 11:23:27.098162653 +0000 UTC m=+807.648529007" observedRunningTime="2025-10-02 11:23:27.534077615 +0000 UTC m=+808.084443989" watchObservedRunningTime="2025-10-02 11:23:27.536575429 +0000 UTC m=+808.086941813" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.309227 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv"] Oct 02 11:23:30 crc kubenswrapper[4929]: E1002 11:23:30.309661 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0986f61a-c4cc-426b-861c-343816225e99" containerName="util" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.309674 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0986f61a-c4cc-426b-861c-343816225e99" containerName="util" Oct 02 11:23:30 crc kubenswrapper[4929]: E1002 11:23:30.309689 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0986f61a-c4cc-426b-861c-343816225e99" containerName="extract" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.309694 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0986f61a-c4cc-426b-861c-343816225e99" containerName="extract" Oct 02 11:23:30 crc kubenswrapper[4929]: E1002 11:23:30.309705 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0986f61a-c4cc-426b-861c-343816225e99" containerName="pull" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.309710 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0986f61a-c4cc-426b-861c-343816225e99" containerName="pull" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.309798 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0986f61a-c4cc-426b-861c-343816225e99" 
containerName="extract" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.310141 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.312486 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.312618 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-qgk7h" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.312717 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.318339 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv"] Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.345522 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rhrm\" (UniqueName: \"kubernetes.io/projected/7364e0e7-ec72-4345-bd2d-09eaf28a4db4-kube-api-access-5rhrm\") pod \"nmstate-operator-858ddd8f98-g6tqv\" (UID: \"7364e0e7-ec72-4345-bd2d-09eaf28a4db4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.446657 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rhrm\" (UniqueName: \"kubernetes.io/projected/7364e0e7-ec72-4345-bd2d-09eaf28a4db4-kube-api-access-5rhrm\") pod \"nmstate-operator-858ddd8f98-g6tqv\" (UID: \"7364e0e7-ec72-4345-bd2d-09eaf28a4db4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.463621 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rhrm\" (UniqueName: \"kubernetes.io/projected/7364e0e7-ec72-4345-bd2d-09eaf28a4db4-kube-api-access-5rhrm\") pod \"nmstate-operator-858ddd8f98-g6tqv\" (UID: \"7364e0e7-ec72-4345-bd2d-09eaf28a4db4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" Oct 02 11:23:30 crc kubenswrapper[4929]: I1002 11:23:30.626690 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" Oct 02 11:23:31 crc kubenswrapper[4929]: I1002 11:23:31.015498 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv"] Oct 02 11:23:31 crc kubenswrapper[4929]: I1002 11:23:31.533981 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" event={"ID":"7364e0e7-ec72-4345-bd2d-09eaf28a4db4","Type":"ContainerStarted","Data":"a5b68980f25d8f53a4daf1e1cc42a273cbe73eab3da3921825e9fa1d47e2f755"} Oct 02 11:23:32 crc kubenswrapper[4929]: I1002 11:23:32.198495 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:32 crc kubenswrapper[4929]: I1002 11:23:32.198594 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:32 crc kubenswrapper[4929]: I1002 11:23:32.251610 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:32 crc kubenswrapper[4929]: I1002 11:23:32.574318 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.262045 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wlsg2"] Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.263063 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.273645 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlsg2"] Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.379062 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-catalog-content\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.379178 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-utilities\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.379212 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdq64\" (UniqueName: \"kubernetes.io/projected/6417cdf0-d142-42b8-a604-a9ab007c92d3-kube-api-access-gdq64\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.480582 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-catalog-content\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc 
kubenswrapper[4929]: I1002 11:23:33.480648 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-utilities\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.480670 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdq64\" (UniqueName: \"kubernetes.io/projected/6417cdf0-d142-42b8-a604-a9ab007c92d3-kube-api-access-gdq64\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.481313 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-utilities\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.481341 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-catalog-content\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.503820 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdq64\" (UniqueName: \"kubernetes.io/projected/6417cdf0-d142-42b8-a604-a9ab007c92d3-kube-api-access-gdq64\") pod \"certified-operators-wlsg2\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:33 crc kubenswrapper[4929]: I1002 11:23:33.580979 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:34 crc kubenswrapper[4929]: I1002 11:23:34.028984 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlsg2"] Oct 02 11:23:34 crc kubenswrapper[4929]: W1002 11:23:34.032623 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6417cdf0_d142_42b8_a604_a9ab007c92d3.slice/crio-4e2e445cf71cce82a03b2b718328a79916d0c7efd323dc6ae236a51650019b56 WatchSource:0}: Error finding container 4e2e445cf71cce82a03b2b718328a79916d0c7efd323dc6ae236a51650019b56: Status 404 returned error can't find the container with id 4e2e445cf71cce82a03b2b718328a79916d0c7efd323dc6ae236a51650019b56 Oct 02 11:23:34 crc kubenswrapper[4929]: I1002 11:23:34.549567 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlsg2" event={"ID":"6417cdf0-d142-42b8-a604-a9ab007c92d3","Type":"ContainerStarted","Data":"4e2e445cf71cce82a03b2b718328a79916d0c7efd323dc6ae236a51650019b56"} Oct 02 11:23:35 crc kubenswrapper[4929]: I1002 11:23:35.558322 4929 generic.go:334] "Generic (PLEG): container finished" podID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerID="ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5" exitCode=0 Oct 02 11:23:35 crc kubenswrapper[4929]: I1002 11:23:35.558362 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlsg2" event={"ID":"6417cdf0-d142-42b8-a604-a9ab007c92d3","Type":"ContainerDied","Data":"ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5"} Oct 02 11:23:35 crc kubenswrapper[4929]: I1002 11:23:35.856901 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dght"] Oct 02 11:23:35 crc kubenswrapper[4929]: I1002 11:23:35.857605 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6dght" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="registry-server" containerID="cri-o://c12046180c61e148f59812ac004c8f8dd526bd9a81c72d3e457008882303cf06" gracePeriod=2 Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.566972 4929 generic.go:334] "Generic (PLEG): container finished" podID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerID="c12046180c61e148f59812ac004c8f8dd526bd9a81c72d3e457008882303cf06" exitCode=0 Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.567009 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dght" event={"ID":"667150be-d3bc-4026-baf3-f2998d02fa9c","Type":"ContainerDied","Data":"c12046180c61e148f59812ac004c8f8dd526bd9a81c72d3e457008882303cf06"} Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.815468 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.824064 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvpdk\" (UniqueName: \"kubernetes.io/projected/667150be-d3bc-4026-baf3-f2998d02fa9c-kube-api-access-gvpdk\") pod \"667150be-d3bc-4026-baf3-f2998d02fa9c\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.833386 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667150be-d3bc-4026-baf3-f2998d02fa9c-kube-api-access-gvpdk" (OuterVolumeSpecName: "kube-api-access-gvpdk") pod "667150be-d3bc-4026-baf3-f2998d02fa9c" (UID: "667150be-d3bc-4026-baf3-f2998d02fa9c"). InnerVolumeSpecName "kube-api-access-gvpdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.924862 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-catalog-content\") pod \"667150be-d3bc-4026-baf3-f2998d02fa9c\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.925066 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-utilities\") pod \"667150be-d3bc-4026-baf3-f2998d02fa9c\" (UID: \"667150be-d3bc-4026-baf3-f2998d02fa9c\") " Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.925322 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvpdk\" (UniqueName: \"kubernetes.io/projected/667150be-d3bc-4026-baf3-f2998d02fa9c-kube-api-access-gvpdk\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:36 crc kubenswrapper[4929]: I1002 11:23:36.926093 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-utilities" (OuterVolumeSpecName: "utilities") pod "667150be-d3bc-4026-baf3-f2998d02fa9c" (UID: "667150be-d3bc-4026-baf3-f2998d02fa9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.017797 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "667150be-d3bc-4026-baf3-f2998d02fa9c" (UID: "667150be-d3bc-4026-baf3-f2998d02fa9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.027219 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.027273 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/667150be-d3bc-4026-baf3-f2998d02fa9c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.574634 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6dght" event={"ID":"667150be-d3bc-4026-baf3-f2998d02fa9c","Type":"ContainerDied","Data":"c0a97d42a7cb36e84e037a02d8446d486baace1f6d6fad3ce83bf106828016aa"} Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.574723 4929 scope.go:117] "RemoveContainer" containerID="c12046180c61e148f59812ac004c8f8dd526bd9a81c72d3e457008882303cf06" Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.574740 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6dght" Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.610520 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6dght"] Oct 02 11:23:37 crc kubenswrapper[4929]: I1002 11:23:37.615387 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6dght"] Oct 02 11:23:38 crc kubenswrapper[4929]: I1002 11:23:38.164993 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" path="/var/lib/kubelet/pods/667150be-d3bc-4026-baf3-f2998d02fa9c/volumes" Oct 02 11:23:39 crc kubenswrapper[4929]: I1002 11:23:39.922153 4929 scope.go:117] "RemoveContainer" containerID="55ee2ac82dbd6a51fbb92f6d22bb5e033da0a02ee4a41c2257a2cc582d373c81" Oct 02 11:23:40 crc kubenswrapper[4929]: I1002 11:23:40.078940 4929 scope.go:117] "RemoveContainer" containerID="a34db5d9b21b7867dade443deb88b4f79edec3f6d7700a3814e8514c89790ebd" Oct 02 11:23:41 crc kubenswrapper[4929]: I1002 11:23:41.602344 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" event={"ID":"7364e0e7-ec72-4345-bd2d-09eaf28a4db4","Type":"ContainerStarted","Data":"df930b59c6ab980490a2ac7e20772361883655d5aebbac6eff2f5eb7132ba596"} Oct 02 11:23:41 crc kubenswrapper[4929]: I1002 11:23:41.606865 4929 generic.go:334] "Generic (PLEG): container finished" podID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerID="f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b" exitCode=0 Oct 02 11:23:41 crc kubenswrapper[4929]: I1002 11:23:41.606927 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlsg2" event={"ID":"6417cdf0-d142-42b8-a604-a9ab007c92d3","Type":"ContainerDied","Data":"f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b"} Oct 02 11:23:41 crc kubenswrapper[4929]: I1002 11:23:41.636680 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g6tqv" podStartSLOduration=2.313364316 podStartE2EDuration="11.636660784s" podCreationTimestamp="2025-10-02 11:23:30 +0000 UTC" firstStartedPulling="2025-10-02 11:23:31.022998392 +0000 UTC m=+811.573364756" 
lastFinishedPulling="2025-10-02 11:23:40.34629486 +0000 UTC m=+820.896661224" observedRunningTime="2025-10-02 11:23:41.633775528 +0000 UTC m=+822.184141932" watchObservedRunningTime="2025-10-02 11:23:41.636660784 +0000 UTC m=+822.187027148" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.499857 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j"] Oct 02 11:23:42 crc kubenswrapper[4929]: E1002 11:23:42.500113 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="extract-content" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.500136 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="extract-content" Oct 02 11:23:42 crc kubenswrapper[4929]: E1002 11:23:42.500151 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="extract-utilities" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.500158 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="extract-utilities" Oct 02 11:23:42 crc kubenswrapper[4929]: E1002 11:23:42.500168 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="registry-server" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.500176 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="registry-server" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.500292 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="667150be-d3bc-4026-baf3-f2998d02fa9c" containerName="registry-server" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.500919 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.502746 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-7w4fk" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.531044 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j"] Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.534652 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr"] Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.535451 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.537039 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.551010 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hmcq8"] Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.551843 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.565540 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr"] Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.632259 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5"] Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.634030 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.636106 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.636391 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.636574 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rc9kx" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.663664 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5"] Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.697507 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-dbus-socket\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.698276 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-ovs-socket\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.698332 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wfr\" (UniqueName: \"kubernetes.io/projected/45290c56-c1df-4f53-a3ed-796fd6624bab-kube-api-access-p7wfr\") pod \"nmstate-webhook-6cdbc54649-q7xjr\" (UID: \"45290c56-c1df-4f53-a3ed-796fd6624bab\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.698380 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cznjm\" (UniqueName: \"kubernetes.io/projected/a6096daf-c096-473d-b53c-07af5d99a7ee-kube-api-access-cznjm\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.698439 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-nmstate-lock\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.698556 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" 
(UniqueName: \"kubernetes.io/secret/45290c56-c1df-4f53-a3ed-796fd6624bab-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-q7xjr\" (UID: \"45290c56-c1df-4f53-a3ed-796fd6624bab\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.698581 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql7hf\" (UniqueName: \"kubernetes.io/projected/d1617fbb-7eac-484e-bf5f-ec3083165f47-kube-api-access-ql7hf\") pod \"nmstate-metrics-fdff9cb8d-xvd9j\" (UID: \"d1617fbb-7eac-484e-bf5f-ec3083165f47\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799038 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/45290c56-c1df-4f53-a3ed-796fd6624bab-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-q7xjr\" (UID: \"45290c56-c1df-4f53-a3ed-796fd6624bab\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799435 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql7hf\" (UniqueName: \"kubernetes.io/projected/d1617fbb-7eac-484e-bf5f-ec3083165f47-kube-api-access-ql7hf\") pod \"nmstate-metrics-fdff9cb8d-xvd9j\" (UID: \"d1617fbb-7eac-484e-bf5f-ec3083165f47\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799468 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hvf\" (UniqueName: \"kubernetes.io/projected/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-kube-api-access-62hvf\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799502 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-dbus-socket\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799525 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799555 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-ovs-socket\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799585 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wfr\" (UniqueName: \"kubernetes.io/projected/45290c56-c1df-4f53-a3ed-796fd6624bab-kube-api-access-p7wfr\") pod \"nmstate-webhook-6cdbc54649-q7xjr\" (UID: \"45290c56-c1df-4f53-a3ed-796fd6624bab\") " 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799621 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cznjm\" (UniqueName: \"kubernetes.io/projected/a6096daf-c096-473d-b53c-07af5d99a7ee-kube-api-access-cznjm\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799659 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-nmstate-lock\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799676 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-ovs-socket\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799695 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799778 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-nmstate-lock\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.799925 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a6096daf-c096-473d-b53c-07af5d99a7ee-dbus-socket\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.821106 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/45290c56-c1df-4f53-a3ed-796fd6624bab-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-q7xjr\" (UID: \"45290c56-c1df-4f53-a3ed-796fd6624bab\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.823867 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wfr\" (UniqueName: \"kubernetes.io/projected/45290c56-c1df-4f53-a3ed-796fd6624bab-kube-api-access-p7wfr\") pod \"nmstate-webhook-6cdbc54649-q7xjr\" (UID: \"45290c56-c1df-4f53-a3ed-796fd6624bab\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.826109 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql7hf\" (UniqueName: \"kubernetes.io/projected/d1617fbb-7eac-484e-bf5f-ec3083165f47-kube-api-access-ql7hf\") pod \"nmstate-metrics-fdff9cb8d-xvd9j\" (UID: \"d1617fbb-7eac-484e-bf5f-ec3083165f47\") " 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.827734 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.827995 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cznjm\" (UniqueName: \"kubernetes.io/projected/a6096daf-c096-473d-b53c-07af5d99a7ee-kube-api-access-cznjm\") pod \"nmstate-handler-hmcq8\" (UID: \"a6096daf-c096-473d-b53c-07af5d99a7ee\") " pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.858293 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.872685 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.901315 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62hvf\" (UniqueName: \"kubernetes.io/projected/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-kube-api-access-62hvf\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.901359 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.901425 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: E1002 11:23:42.901755 4929 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 02 11:23:42 crc kubenswrapper[4929]: E1002 11:23:42.901832 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-plugin-serving-cert podName:fcde8b6a-6d5a-47a8-8d66-78bfc8934410 nodeName:}" failed. No retries permitted until 2025-10-02 11:23:43.40181018 +0000 UTC m=+823.952176544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-n52n5" (UID: "fcde8b6a-6d5a-47a8-8d66-78bfc8934410") : secret "plugin-serving-cert" not found Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.902243 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.917181 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55fbcbdddd-nczbs"] Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.917914 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.921922 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hvf\" (UniqueName: \"kubernetes.io/projected/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-kube-api-access-62hvf\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:42 crc kubenswrapper[4929]: I1002 11:23:42.939816 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55fbcbdddd-nczbs"] Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.111343 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtcxt\" (UniqueName: \"kubernetes.io/projected/db21a236-a1c3-4221-8f52-197d9ef69379-kube-api-access-mtcxt\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.111410 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-oauth-serving-cert\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.111454 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db21a236-a1c3-4221-8f52-197d9ef69379-console-oauth-config\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.111476 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-service-ca\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.111513 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-trusted-ca-bundle\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.111533 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-console-config\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.111570 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db21a236-a1c3-4221-8f52-197d9ef69379-console-serving-cert\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.115380 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j"] Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.212816 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db21a236-a1c3-4221-8f52-197d9ef69379-console-serving-cert\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.212878 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtcxt\" (UniqueName: \"kubernetes.io/projected/db21a236-a1c3-4221-8f52-197d9ef69379-kube-api-access-mtcxt\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.212927 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-oauth-serving-cert\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.212978 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db21a236-a1c3-4221-8f52-197d9ef69379-console-oauth-config\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.213000 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-service-ca\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.213014 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-trusted-ca-bundle\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " 
pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.213033 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-console-config\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.214135 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-console-config\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.214128 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-service-ca\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.214483 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-oauth-serving-cert\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.214778 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db21a236-a1c3-4221-8f52-197d9ef69379-trusted-ca-bundle\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.218334 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db21a236-a1c3-4221-8f52-197d9ef69379-console-oauth-config\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.218602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db21a236-a1c3-4221-8f52-197d9ef69379-console-serving-cert\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.227657 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtcxt\" (UniqueName: \"kubernetes.io/projected/db21a236-a1c3-4221-8f52-197d9ef69379-kube-api-access-mtcxt\") pod \"console-55fbcbdddd-nczbs\" (UID: \"db21a236-a1c3-4221-8f52-197d9ef69379\") " pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.238233 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.380297 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr"] Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.415510 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.420230 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcde8b6a-6d5a-47a8-8d66-78bfc8934410-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-n52n5\" (UID: \"fcde8b6a-6d5a-47a8-8d66-78bfc8934410\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.440561 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55fbcbdddd-nczbs"] Oct 02 11:23:43 crc kubenswrapper[4929]: W1002 11:23:43.445246 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb21a236_a1c3_4221_8f52_197d9ef69379.slice/crio-fd5ec84edbcdd0e2de573ee9a6a36801e2ab73cfdf95f275698ed89e53f51f04 WatchSource:0}: Error finding container fd5ec84edbcdd0e2de573ee9a6a36801e2ab73cfdf95f275698ed89e53f51f04: Status 404 returned error can't find the container with id fd5ec84edbcdd0e2de573ee9a6a36801e2ab73cfdf95f275698ed89e53f51f04 Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.578800 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.619121 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" event={"ID":"45290c56-c1df-4f53-a3ed-796fd6624bab","Type":"ContainerStarted","Data":"c327cc0b62ad39cae694c7170285abbf0a3db5e940090d3dc3a3763ddf65de3f"} Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.620788 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" event={"ID":"d1617fbb-7eac-484e-bf5f-ec3083165f47","Type":"ContainerStarted","Data":"5beec7bdad5c50c8b5a13322635de719adf28394d144d5a54f9c704e1b15d255"} Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.625171 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlsg2" event={"ID":"6417cdf0-d142-42b8-a604-a9ab007c92d3","Type":"ContainerStarted","Data":"8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59"} Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.626913 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55fbcbdddd-nczbs" event={"ID":"db21a236-a1c3-4221-8f52-197d9ef69379","Type":"ContainerStarted","Data":"fd5ec84edbcdd0e2de573ee9a6a36801e2ab73cfdf95f275698ed89e53f51f04"} Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.627793 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hmcq8" event={"ID":"a6096daf-c096-473d-b53c-07af5d99a7ee","Type":"ContainerStarted","Data":"abfcad23fb746d6faf7c9f42f55a9dca19ad824215d1de8e28c3c099a2874d8a"} Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.658632 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wlsg2" podStartSLOduration=3.6138812639999998 podStartE2EDuration="10.65860841s" podCreationTimestamp="2025-10-02 11:23:33 +0000 UTC" firstStartedPulling="2025-10-02 11:23:35.560358021 +0000 UTC m=+816.110724385" lastFinishedPulling="2025-10-02 11:23:42.605085167 +0000 UTC m=+823.155451531" observedRunningTime="2025-10-02 11:23:43.648769679 +0000 UTC m=+824.199136083" watchObservedRunningTime="2025-10-02 11:23:43.65860841 +0000 UTC m=+824.208974804" Oct 02 11:23:43 crc kubenswrapper[4929]: I1002 11:23:43.840299 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5"] Oct 02 11:23:43 crc kubenswrapper[4929]: W1002 11:23:43.844417 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcde8b6a_6d5a_47a8_8d66_78bfc8934410.slice/crio-6f231617fad024af84ff2caf39eae0a6bc61d8b4087e1e402d5557fd0ef881af WatchSource:0}: Error finding container 6f231617fad024af84ff2caf39eae0a6bc61d8b4087e1e402d5557fd0ef881af: Status 404 returned error can't find the container with id 6f231617fad024af84ff2caf39eae0a6bc61d8b4087e1e402d5557fd0ef881af Oct 02 11:23:44 crc kubenswrapper[4929]: I1002 11:23:44.637491 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55fbcbdddd-nczbs" event={"ID":"db21a236-a1c3-4221-8f52-197d9ef69379","Type":"ContainerStarted","Data":"f6729658087d43b7a666a9dc1a2a7dc0af5146f93d25e6fb6f80f3c8e888bdde"} Oct 02 11:23:44 crc kubenswrapper[4929]: I1002 11:23:44.638682 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" event={"ID":"fcde8b6a-6d5a-47a8-8d66-78bfc8934410","Type":"ContainerStarted","Data":"6f231617fad024af84ff2caf39eae0a6bc61d8b4087e1e402d5557fd0ef881af"} Oct 02 11:23:44 crc kubenswrapper[4929]: I1002 11:23:44.668026 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55fbcbdddd-nczbs" podStartSLOduration=2.667994057 podStartE2EDuration="2.667994057s" podCreationTimestamp="2025-10-02 11:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:23:44.664140813 +0000 UTC m=+825.214507227" watchObservedRunningTime="2025-10-02 11:23:44.667994057 +0000 UTC m=+825.218360461" Oct 02 11:23:44 crc kubenswrapper[4929]: I1002 11:23:44.737643 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:23:44 crc kubenswrapper[4929]: I1002 11:23:44.737722 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.665002 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" event={"ID":"45290c56-c1df-4f53-a3ed-796fd6624bab","Type":"ContainerStarted","Data":"c4daeba4e842bde5fe87f1a86dfbaebef796d2027faeb2d1fa0a1ed4d7439802"} Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.665096 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.667505 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" event={"ID":"d1617fbb-7eac-484e-bf5f-ec3083165f47","Type":"ContainerStarted","Data":"d28aec41e78a0ba1b1f0948bbbc646c0c04d8c49f87c419b5c3a9dbfc8e161b3"} Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.669237 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" event={"ID":"fcde8b6a-6d5a-47a8-8d66-78bfc8934410","Type":"ContainerStarted","Data":"ba0b01937f19458bbd071a7863f4f8ed2d8fb52ba14f69212c0888fada258a46"} Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.671885 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hmcq8" event={"ID":"a6096daf-c096-473d-b53c-07af5d99a7ee","Type":"ContainerStarted","Data":"eb1af8ef0ea84d6a6e72f0855bd07734cb267bb2f0fa7c4323a864611eaffd02"} Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.671998 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.681483 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" podStartSLOduration=1.875408646 podStartE2EDuration="7.681465419s" podCreationTimestamp="2025-10-02 11:23:42 +0000 UTC" firstStartedPulling="2025-10-02 
11:23:43.391813523 +0000 UTC m=+823.942179887" lastFinishedPulling="2025-10-02 11:23:49.197870296 +0000 UTC m=+829.748236660" observedRunningTime="2025-10-02 11:23:49.678089289 +0000 UTC m=+830.228455663" watchObservedRunningTime="2025-10-02 11:23:49.681465419 +0000 UTC m=+830.231831783" Oct 02 11:23:49 crc kubenswrapper[4929]: I1002 11:23:49.726157 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-n52n5" podStartSLOduration=2.371639784 podStartE2EDuration="7.726138681s" podCreationTimestamp="2025-10-02 11:23:42 +0000 UTC" firstStartedPulling="2025-10-02 11:23:43.846351357 +0000 UTC m=+824.396717721" lastFinishedPulling="2025-10-02 11:23:49.200850254 +0000 UTC m=+829.751216618" observedRunningTime="2025-10-02 11:23:49.70446891 +0000 UTC m=+830.254835294" watchObservedRunningTime="2025-10-02 11:23:49.726138681 +0000 UTC m=+830.276505045" Oct 02 11:23:50 crc kubenswrapper[4929]: I1002 11:23:50.176738 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hmcq8" podStartSLOduration=1.895724038 podStartE2EDuration="8.176716998s" podCreationTimestamp="2025-10-02 11:23:42 +0000 UTC" firstStartedPulling="2025-10-02 11:23:42.926224713 +0000 UTC m=+823.476591077" lastFinishedPulling="2025-10-02 11:23:49.207217673 +0000 UTC m=+829.757584037" observedRunningTime="2025-10-02 11:23:49.726526653 +0000 UTC m=+830.276893007" watchObservedRunningTime="2025-10-02 11:23:50.176716998 +0000 UTC m=+830.727083362" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.238547 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.239046 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.242685 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.581748 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.581801 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.646299 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.695896 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" event={"ID":"d1617fbb-7eac-484e-bf5f-ec3083165f47","Type":"ContainerStarted","Data":"9e90677aec8bac78233d56f614d8c49d2abbe74d4e096db94fc182f6dfbbb832"} Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.701373 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55fbcbdddd-nczbs" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.715276 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-xvd9j" podStartSLOduration=2.07284158 podStartE2EDuration="11.715255893s" podCreationTimestamp="2025-10-02 11:23:42 +0000 UTC" firstStartedPulling="2025-10-02 11:23:43.125589173 
+0000 UTC m=+823.675955537" lastFinishedPulling="2025-10-02 11:23:52.768003486 +0000 UTC m=+833.318369850" observedRunningTime="2025-10-02 11:23:53.712743799 +0000 UTC m=+834.263110163" watchObservedRunningTime="2025-10-02 11:23:53.715255893 +0000 UTC m=+834.265622257" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.737762 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.766265 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zc6nf"] Oct 02 11:23:53 crc kubenswrapper[4929]: I1002 11:23:53.871589 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlsg2"] Oct 02 11:23:55 crc kubenswrapper[4929]: I1002 11:23:55.708522 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wlsg2" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="registry-server" containerID="cri-o://8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59" gracePeriod=2 Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.078650 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.157808 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-catalog-content\") pod \"6417cdf0-d142-42b8-a604-a9ab007c92d3\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.157920 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdq64\" (UniqueName: \"kubernetes.io/projected/6417cdf0-d142-42b8-a604-a9ab007c92d3-kube-api-access-gdq64\") pod \"6417cdf0-d142-42b8-a604-a9ab007c92d3\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.157986 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-utilities\") pod \"6417cdf0-d142-42b8-a604-a9ab007c92d3\" (UID: \"6417cdf0-d142-42b8-a604-a9ab007c92d3\") " Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.159315 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-utilities" (OuterVolumeSpecName: "utilities") pod "6417cdf0-d142-42b8-a604-a9ab007c92d3" (UID: "6417cdf0-d142-42b8-a604-a9ab007c92d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.168685 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6417cdf0-d142-42b8-a604-a9ab007c92d3-kube-api-access-gdq64" (OuterVolumeSpecName: "kube-api-access-gdq64") pod "6417cdf0-d142-42b8-a604-a9ab007c92d3" (UID: "6417cdf0-d142-42b8-a604-a9ab007c92d3"). InnerVolumeSpecName "kube-api-access-gdq64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.221680 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6417cdf0-d142-42b8-a604-a9ab007c92d3" (UID: "6417cdf0-d142-42b8-a604-a9ab007c92d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.259563 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.259631 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdq64\" (UniqueName: \"kubernetes.io/projected/6417cdf0-d142-42b8-a604-a9ab007c92d3-kube-api-access-gdq64\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.259651 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6417cdf0-d142-42b8-a604-a9ab007c92d3-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.720245 4929 generic.go:334] "Generic (PLEG): container finished" podID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerID="8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59" exitCode=0 Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.720324 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlsg2" event={"ID":"6417cdf0-d142-42b8-a604-a9ab007c92d3","Type":"ContainerDied","Data":"8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59"} Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.720347 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wlsg2" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.720377 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlsg2" event={"ID":"6417cdf0-d142-42b8-a604-a9ab007c92d3","Type":"ContainerDied","Data":"4e2e445cf71cce82a03b2b718328a79916d0c7efd323dc6ae236a51650019b56"} Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.720417 4929 scope.go:117] "RemoveContainer" containerID="8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.739612 4929 scope.go:117] "RemoveContainer" containerID="f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.759147 4929 scope.go:117] "RemoveContainer" containerID="ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.783633 4929 scope.go:117] "RemoveContainer" containerID="8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59" Oct 02 11:23:56 crc kubenswrapper[4929]: E1002 11:23:56.784062 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59\": container with ID starting with 8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59 not found: ID does not exist" containerID="8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.784098 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59"} err="failed to get container status \"8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59\": rpc error: code = NotFound desc = could not find container \"8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59\": container with ID starting with 8a630b129e5916f06c93da7d29afec0343f2b43f96b8e612793de4e703e06e59 not found: ID does not exist" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.784125 4929 scope.go:117] "RemoveContainer" containerID="f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b" Oct 02 11:23:56 crc kubenswrapper[4929]: E1002 11:23:56.784405 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b\": container with ID starting with f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b not found: ID does not exist" containerID="f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.784448 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b"} err="failed to get container status \"f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b\": rpc error: code = NotFound desc = could not find container \"f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b\": container with ID starting with f5ec1addc79f6e85c00dc1832d116aa0475a6f01780d7cb7b33b087f3234465b not found: ID does not exist" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.784476 4929 scope.go:117] "RemoveContainer" 
containerID="ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5" Oct 02 11:23:56 crc kubenswrapper[4929]: E1002 11:23:56.784765 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5\": container with ID starting with ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5 not found: ID does not exist" containerID="ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.784800 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5"} err="failed to get container status \"ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5\": rpc error: code = NotFound desc = could not find container \"ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5\": container with ID starting with ecddee0c0ba6991ab69c674b7462549a1e5109ced1dee1136982caabfb6548a5 not found: ID does not exist" Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.784866 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlsg2"] Oct 02 11:23:56 crc kubenswrapper[4929]: I1002 11:23:56.788039 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wlsg2"] Oct 02 11:23:57 crc kubenswrapper[4929]: I1002 11:23:57.897414 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hmcq8" Oct 02 11:23:58 crc kubenswrapper[4929]: I1002 11:23:58.167743 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" path="/var/lib/kubelet/pods/6417cdf0-d142-42b8-a604-a9ab007c92d3/volumes" Oct 02 11:24:02 crc kubenswrapper[4929]: I1002 11:24:02.863901 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-q7xjr" Oct 02 11:24:14 crc kubenswrapper[4929]: I1002 11:24:14.737258 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:24:14 crc kubenswrapper[4929]: I1002 11:24:14.737865 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:24:14 crc kubenswrapper[4929]: I1002 11:24:14.737916 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:24:14 crc kubenswrapper[4929]: I1002 11:24:14.738570 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f87898e72f32d780a00a4311f29a4b41ada294ade544d5a9ece8958a1d5f9fd0"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:24:14 crc kubenswrapper[4929]: I1002 
11:24:14.738635 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://f87898e72f32d780a00a4311f29a4b41ada294ade544d5a9ece8958a1d5f9fd0" gracePeriod=600 Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.028743 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm"] Oct 02 11:24:15 crc kubenswrapper[4929]: E1002 11:24:15.029245 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="extract-content" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.029262 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="extract-content" Oct 02 11:24:15 crc kubenswrapper[4929]: E1002 11:24:15.029287 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="registry-server" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.029298 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="registry-server" Oct 02 11:24:15 crc kubenswrapper[4929]: E1002 11:24:15.029311 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="extract-utilities" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.029320 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="extract-utilities" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.029438 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="6417cdf0-d142-42b8-a604-a9ab007c92d3" containerName="registry-server" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.030302 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.032641 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.043351 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm"] Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.132519 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.132596 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.132636 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdnw\" (UniqueName: \"kubernetes.io/projected/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-kube-api-access-zfdnw\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.235258 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.235434 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdnw\" (UniqueName: \"kubernetes.io/projected/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-kube-api-access-zfdnw\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.235686 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.235686 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.236634 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.266847 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdnw\" (UniqueName: \"kubernetes.io/projected/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-kube-api-access-zfdnw\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.363836 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.837157 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="f87898e72f32d780a00a4311f29a4b41ada294ade544d5a9ece8958a1d5f9fd0" exitCode=0 Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.837283 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"f87898e72f32d780a00a4311f29a4b41ada294ade544d5a9ece8958a1d5f9fd0"} Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.837500 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"4f30c8067764cbf742a0d9d0a1f047810aa84e3e7853a564b95946cb32658616"} Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.837525 4929 scope.go:117] "RemoveContainer" containerID="734030d7b32aee89aaf1f696dd592d8b76828337b0add53e8b02a123d5ff922c" Oct 02 11:24:15 crc kubenswrapper[4929]: I1002 11:24:15.861112 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm"] Oct 02 11:24:16 crc kubenswrapper[4929]: I1002 11:24:16.848096 4929 generic.go:334] "Generic (PLEG): container finished" podID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerID="04f0c3a581d4190cc86e5e998f086752076b1d76a1de0952c2309bbc8bec979d" exitCode=0 Oct 02 11:24:16 crc kubenswrapper[4929]: I1002 11:24:16.848165 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" event={"ID":"9c86c553-24cc-44e5-ae1d-0b91e5e44c88","Type":"ContainerDied","Data":"04f0c3a581d4190cc86e5e998f086752076b1d76a1de0952c2309bbc8bec979d"} Oct 02 11:24:16 crc kubenswrapper[4929]: I1002 11:24:16.848556 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" event={"ID":"9c86c553-24cc-44e5-ae1d-0b91e5e44c88","Type":"ContainerStarted","Data":"17f37a6a9b886780195c2410409a72e06e7164634551168916449d13c042daaf"} Oct 02 11:24:18 crc kubenswrapper[4929]: I1002 11:24:18.808184 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zc6nf" podUID="c446dd7b-73fd-4b60-91d9-f1b74df3b69a" containerName="console" containerID="cri-o://3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6" gracePeriod=15 Oct 02 11:24:18 crc kubenswrapper[4929]: I1002 11:24:18.860351 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" event={"ID":"9c86c553-24cc-44e5-ae1d-0b91e5e44c88","Type":"ContainerDied","Data":"0929d23445e134f1e4b6190cf556255397fa5664bd117a039b33717369c357f3"} Oct 02 11:24:18 crc kubenswrapper[4929]: I1002 11:24:18.860094 4929 generic.go:334] "Generic (PLEG): container finished" podID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerID="0929d23445e134f1e4b6190cf556255397fa5664bd117a039b33717369c357f3" exitCode=0 Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.191721 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zc6nf_c446dd7b-73fd-4b60-91d9-f1b74df3b69a/console/0.log" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.192043 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.300070 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-oauth-serving-cert\") pod \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.300141 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-trusted-ca-bundle\") pod \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.300217 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-service-ca\") pod \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.300240 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-oauth-config\") pod \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.300277 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-serving-cert\") pod \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.300310 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-sjhxs\" (UniqueName: \"kubernetes.io/projected/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-kube-api-access-sjhxs\") pod \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.300340 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-config\") pod \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\" (UID: \"c446dd7b-73fd-4b60-91d9-f1b74df3b69a\") " Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.301097 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c446dd7b-73fd-4b60-91d9-f1b74df3b69a" (UID: "c446dd7b-73fd-4b60-91d9-f1b74df3b69a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.301276 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c446dd7b-73fd-4b60-91d9-f1b74df3b69a" (UID: "c446dd7b-73fd-4b60-91d9-f1b74df3b69a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.301333 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-config" (OuterVolumeSpecName: "console-config") pod "c446dd7b-73fd-4b60-91d9-f1b74df3b69a" (UID: "c446dd7b-73fd-4b60-91d9-f1b74df3b69a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.301481 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-service-ca" (OuterVolumeSpecName: "service-ca") pod "c446dd7b-73fd-4b60-91d9-f1b74df3b69a" (UID: "c446dd7b-73fd-4b60-91d9-f1b74df3b69a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.305704 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-kube-api-access-sjhxs" (OuterVolumeSpecName: "kube-api-access-sjhxs") pod "c446dd7b-73fd-4b60-91d9-f1b74df3b69a" (UID: "c446dd7b-73fd-4b60-91d9-f1b74df3b69a"). InnerVolumeSpecName "kube-api-access-sjhxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.306188 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c446dd7b-73fd-4b60-91d9-f1b74df3b69a" (UID: "c446dd7b-73fd-4b60-91d9-f1b74df3b69a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.306589 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c446dd7b-73fd-4b60-91d9-f1b74df3b69a" (UID: "c446dd7b-73fd-4b60-91d9-f1b74df3b69a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.401900 4929 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.402196 4929 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.402208 4929 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.402217 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjhxs\" (UniqueName: \"kubernetes.io/projected/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-kube-api-access-sjhxs\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.402225 4929 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.402233 4929 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.402240 4929 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c446dd7b-73fd-4b60-91d9-f1b74df3b69a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.869650 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zc6nf_c446dd7b-73fd-4b60-91d9-f1b74df3b69a/console/0.log" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.869761 4929 generic.go:334] "Generic (PLEG): container finished" podID="c446dd7b-73fd-4b60-91d9-f1b74df3b69a" containerID="3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6" exitCode=2 Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.869880 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc6nf" event={"ID":"c446dd7b-73fd-4b60-91d9-f1b74df3b69a","Type":"ContainerDied","Data":"3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6"} Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.869936 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zc6nf" event={"ID":"c446dd7b-73fd-4b60-91d9-f1b74df3b69a","Type":"ContainerDied","Data":"d3d589d05f837e02e76f3ccfc0dd0218e32d25fab44feb970b3adb01773660b3"} Oct 02 11:24:19 crc 
kubenswrapper[4929]: I1002 11:24:19.869985 4929 scope.go:117] "RemoveContainer" containerID="3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.869998 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zc6nf" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.873166 4929 generic.go:334] "Generic (PLEG): container finished" podID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerID="ab515298a94ca56b60e4b6b0c2c96a5493452fb4a1fb34030c9d60f0ac3a0267" exitCode=0 Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.873202 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" event={"ID":"9c86c553-24cc-44e5-ae1d-0b91e5e44c88","Type":"ContainerDied","Data":"ab515298a94ca56b60e4b6b0c2c96a5493452fb4a1fb34030c9d60f0ac3a0267"} Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.885086 4929 scope.go:117] "RemoveContainer" containerID="3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6" Oct 02 11:24:19 crc kubenswrapper[4929]: E1002 11:24:19.885615 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6\": container with ID starting with 3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6 not found: ID does not exist" containerID="3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.885673 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6"} err="failed to get container status \"3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6\": rpc error: code = NotFound desc = could not find container \"3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6\": container with ID starting with 3e61aefb0bc5e082fcfb719bd89996c9c11cf79ce9998a7c265d2d0f42aa4ea6 not found: ID does not exist" Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.910871 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zc6nf"] Oct 02 11:24:19 crc kubenswrapper[4929]: I1002 11:24:19.914147 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zc6nf"] Oct 02 11:24:20 crc kubenswrapper[4929]: I1002 11:24:20.172521 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c446dd7b-73fd-4b60-91d9-f1b74df3b69a" path="/var/lib/kubelet/pods/c446dd7b-73fd-4b60-91d9-f1b74df3b69a/volumes" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.207180 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.327403 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfdnw\" (UniqueName: \"kubernetes.io/projected/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-kube-api-access-zfdnw\") pod \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.327491 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-util\") pod \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.327528 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-bundle\") pod \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\" (UID: \"9c86c553-24cc-44e5-ae1d-0b91e5e44c88\") " Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.328486 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-bundle" (OuterVolumeSpecName: "bundle") pod "9c86c553-24cc-44e5-ae1d-0b91e5e44c88" (UID: "9c86c553-24cc-44e5-ae1d-0b91e5e44c88"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.337129 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-kube-api-access-zfdnw" (OuterVolumeSpecName: "kube-api-access-zfdnw") pod "9c86c553-24cc-44e5-ae1d-0b91e5e44c88" (UID: "9c86c553-24cc-44e5-ae1d-0b91e5e44c88"). InnerVolumeSpecName "kube-api-access-zfdnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.340440 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-util" (OuterVolumeSpecName: "util") pod "9c86c553-24cc-44e5-ae1d-0b91e5e44c88" (UID: "9c86c553-24cc-44e5-ae1d-0b91e5e44c88"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.429252 4929 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.429282 4929 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.429292 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfdnw\" (UniqueName: \"kubernetes.io/projected/9c86c553-24cc-44e5-ae1d-0b91e5e44c88-kube-api-access-zfdnw\") on node \"crc\" DevicePath \"\"" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.888454 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" event={"ID":"9c86c553-24cc-44e5-ae1d-0b91e5e44c88","Type":"ContainerDied","Data":"17f37a6a9b886780195c2410409a72e06e7164634551168916449d13c042daaf"} Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.888494 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f37a6a9b886780195c2410409a72e06e7164634551168916449d13c042daaf" Oct 02 11:24:21 crc kubenswrapper[4929]: I1002 11:24:21.888498 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.015141 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4"] Oct 02 11:24:30 crc kubenswrapper[4929]: E1002 11:24:30.016014 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c446dd7b-73fd-4b60-91d9-f1b74df3b69a" containerName="console" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.016027 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c446dd7b-73fd-4b60-91d9-f1b74df3b69a" containerName="console" Oct 02 11:24:30 crc kubenswrapper[4929]: E1002 11:24:30.016041 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerName="pull" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.016048 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerName="pull" Oct 02 11:24:30 crc kubenswrapper[4929]: E1002 11:24:30.016059 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerName="util" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.016067 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerName="util" Oct 02 11:24:30 crc kubenswrapper[4929]: E1002 11:24:30.016089 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerName="extract" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.016096 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerName="extract" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.016235 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c446dd7b-73fd-4b60-91d9-f1b74df3b69a" containerName="console" Oct 
02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.016246 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c86c553-24cc-44e5-ae1d-0b91e5e44c88" containerName="extract" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.016700 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.025393 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.025719 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.026317 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.026572 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-76vpx" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.026600 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.038924 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4"] Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.143466 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pqm9\" (UniqueName: \"kubernetes.io/projected/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-kube-api-access-6pqm9\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.143533 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-webhook-cert\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.143721 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-apiservice-cert\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.245487 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-apiservice-cert\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.245605 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pqm9\" (UniqueName: 
\"kubernetes.io/projected/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-kube-api-access-6pqm9\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.245641 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-webhook-cert\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.251735 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-apiservice-cert\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.252595 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-webhook-cert\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.266112 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pqm9\" (UniqueName: \"kubernetes.io/projected/a5e4ea95-81a8-42c4-aa5a-53a414c78b13-kube-api-access-6pqm9\") pod \"metallb-operator-controller-manager-77d94b9d4d-669p4\" (UID: \"a5e4ea95-81a8-42c4-aa5a-53a414c78b13\") " pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.337780 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.382056 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"] Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.393628 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.398352 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.398374 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.398737 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-frwlt"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.419217 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"]
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.550148 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5bc6808d-8e32-418c-b479-36e879e768d1-webhook-cert\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.550438 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5bc6808d-8e32-418c-b479-36e879e768d1-apiservice-cert\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.550467 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lgcx\" (UniqueName: \"kubernetes.io/projected/5bc6808d-8e32-418c-b479-36e879e768d1-kube-api-access-9lgcx\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.653040 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5bc6808d-8e32-418c-b479-36e879e768d1-webhook-cert\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.653119 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5bc6808d-8e32-418c-b479-36e879e768d1-apiservice-cert\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.653208 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lgcx\" (UniqueName: \"kubernetes.io/projected/5bc6808d-8e32-418c-b479-36e879e768d1-kube-api-access-9lgcx\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.658120 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5bc6808d-8e32-418c-b479-36e879e768d1-webhook-cert\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.672870 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5bc6808d-8e32-418c-b479-36e879e768d1-apiservice-cert\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.682791 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lgcx\" (UniqueName: \"kubernetes.io/projected/5bc6808d-8e32-418c-b479-36e879e768d1-kube-api-access-9lgcx\") pod \"metallb-operator-webhook-server-685ffc5d48-q7qgf\" (UID: \"5bc6808d-8e32-418c-b479-36e879e768d1\") " pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.719606 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.894718 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4"]
Oct 02 11:24:30 crc kubenswrapper[4929]: W1002 11:24:30.912977 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e4ea95_81a8_42c4_aa5a_53a414c78b13.slice/crio-3c02ce69f90515419018ed7cf8a60c2becdb843f1b17482eb63f6d9269c600b1 WatchSource:0}: Error finding container 3c02ce69f90515419018ed7cf8a60c2becdb843f1b17482eb63f6d9269c600b1: Status 404 returned error can't find the container with id 3c02ce69f90515419018ed7cf8a60c2becdb843f1b17482eb63f6d9269c600b1
Oct 02 11:24:30 crc kubenswrapper[4929]: I1002 11:24:30.938815 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" event={"ID":"a5e4ea95-81a8-42c4-aa5a-53a414c78b13","Type":"ContainerStarted","Data":"3c02ce69f90515419018ed7cf8a60c2becdb843f1b17482eb63f6d9269c600b1"}
Oct 02 11:24:31 crc kubenswrapper[4929]: I1002 11:24:31.109751 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"]
Oct 02 11:24:31 crc kubenswrapper[4929]: W1002 11:24:31.118156 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bc6808d_8e32_418c_b479_36e879e768d1.slice/crio-0606cbf9959c97c46e7e3239fac787a64794ce0f4b6a2835e1a50d01f0ab263d WatchSource:0}: Error finding container 0606cbf9959c97c46e7e3239fac787a64794ce0f4b6a2835e1a50d01f0ab263d: Status 404 returned error can't find the container with id 0606cbf9959c97c46e7e3239fac787a64794ce0f4b6a2835e1a50d01f0ab263d
Oct 02 11:24:31 crc kubenswrapper[4929]: I1002 11:24:31.945822 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf" event={"ID":"5bc6808d-8e32-418c-b479-36e879e768d1","Type":"ContainerStarted","Data":"0606cbf9959c97c46e7e3239fac787a64794ce0f4b6a2835e1a50d01f0ab263d"}
Oct 02 11:24:36 crc kubenswrapper[4929]: I1002 11:24:36.990665 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" event={"ID":"a5e4ea95-81a8-42c4-aa5a-53a414c78b13","Type":"ContainerStarted","Data":"67db88e3df20a3868e63b9e7df17bbefb75cdc2fa63941fcd1d6919f3baaa66d"}
Oct 02 11:24:36 crc kubenswrapper[4929]: I1002 11:24:36.991261 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4"
Oct 02 11:24:36 crc kubenswrapper[4929]: I1002 11:24:36.992573 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf" event={"ID":"5bc6808d-8e32-418c-b479-36e879e768d1","Type":"ContainerStarted","Data":"6a6dcf3fc23c19f27c49777901485bd53b346ed68caec7ed8f12ef913c48cee9"}
Oct 02 11:24:36 crc kubenswrapper[4929]: I1002 11:24:36.992731 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:24:37 crc kubenswrapper[4929]: I1002 11:24:37.018481 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4" podStartSLOduration=2.419151124 podStartE2EDuration="8.018462545s" podCreationTimestamp="2025-10-02 11:24:29 +0000 UTC" firstStartedPulling="2025-10-02 11:24:30.915530627 +0000 UTC m=+871.465896991" lastFinishedPulling="2025-10-02 11:24:36.514842048 +0000 UTC m=+877.065208412" observedRunningTime="2025-10-02 11:24:37.014413825 +0000 UTC m=+877.564780199" watchObservedRunningTime="2025-10-02 11:24:37.018462545 +0000 UTC m=+877.568828919"
Oct 02 11:24:37 crc kubenswrapper[4929]: I1002 11:24:37.040579 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf" podStartSLOduration=1.619978229 podStartE2EDuration="7.040556589s" podCreationTimestamp="2025-10-02 11:24:30 +0000 UTC" firstStartedPulling="2025-10-02 11:24:31.122003169 +0000 UTC m=+871.672369533" lastFinishedPulling="2025-10-02 11:24:36.542581519 +0000 UTC m=+877.092947893" observedRunningTime="2025-10-02 11:24:37.038079295 +0000 UTC m=+877.588445669" watchObservedRunningTime="2025-10-02 11:24:37.040556589 +0000 UTC m=+877.590922963"
Oct 02 11:24:50 crc kubenswrapper[4929]: I1002 11:24:50.722856 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-685ffc5d48-q7qgf"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.296534 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mcgd5"]
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.298185 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.301646 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-catalog-content\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.301707 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-utilities\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.301780 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpcft\" (UniqueName: \"kubernetes.io/projected/3228ff9a-1b88-458e-b668-2724a3c4f19e-kube-api-access-mpcft\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.307970 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcgd5"]
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.403302 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-catalog-content\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.403359 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-utilities\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.403385 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpcft\" (UniqueName: \"kubernetes.io/projected/3228ff9a-1b88-458e-b668-2724a3c4f19e-kube-api-access-mpcft\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.403824 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-catalog-content\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.403885 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-utilities\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.436918 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpcft\" (UniqueName: \"kubernetes.io/projected/3228ff9a-1b88-458e-b668-2724a3c4f19e-kube-api-access-mpcft\") pod \"redhat-marketplace-mcgd5\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") " pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.617090 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:03 crc kubenswrapper[4929]: I1002 11:25:03.851656 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcgd5"]
Oct 02 11:25:04 crc kubenswrapper[4929]: I1002 11:25:04.144699 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcgd5" event={"ID":"3228ff9a-1b88-458e-b668-2724a3c4f19e","Type":"ContainerStarted","Data":"4453aae05f774690727a9261095288bb2bea396ca7e64baa31d2e98fe005da27"}
Oct 02 11:25:04 crc kubenswrapper[4929]: I1002 11:25:04.145036 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcgd5" event={"ID":"3228ff9a-1b88-458e-b668-2724a3c4f19e","Type":"ContainerStarted","Data":"a1b76c9751f7f9293e4217a430cad6bf7b1b1d8f7bc518d219b3d8d85f7939a1"}
Oct 02 11:25:05 crc kubenswrapper[4929]: I1002 11:25:05.150985 4929 generic.go:334] "Generic (PLEG): container finished" podID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerID="4453aae05f774690727a9261095288bb2bea396ca7e64baa31d2e98fe005da27" exitCode=0
Oct 02 11:25:05 crc kubenswrapper[4929]: I1002 11:25:05.151045 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcgd5" event={"ID":"3228ff9a-1b88-458e-b668-2724a3c4f19e","Type":"ContainerDied","Data":"4453aae05f774690727a9261095288bb2bea396ca7e64baa31d2e98fe005da27"}
Oct 02 11:25:07 crc kubenswrapper[4929]: I1002 11:25:07.162982 4929 generic.go:334] "Generic (PLEG): container finished" podID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerID="3759552204ab3387fe9b54a585faa55a91a43b1fc82cb958d4d1fc079a89c7f2" exitCode=0
Oct 02 11:25:07 crc kubenswrapper[4929]: I1002 11:25:07.163031 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcgd5" event={"ID":"3228ff9a-1b88-458e-b668-2724a3c4f19e","Type":"ContainerDied","Data":"3759552204ab3387fe9b54a585faa55a91a43b1fc82cb958d4d1fc079a89c7f2"}
Oct 02 11:25:08 crc kubenswrapper[4929]: I1002 11:25:08.169808 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcgd5" event={"ID":"3228ff9a-1b88-458e-b668-2724a3c4f19e","Type":"ContainerStarted","Data":"e6cec318692f10ff8aa96f2521d58df493897d327978602840b80c032f486561"}
Oct 02 11:25:08 crc kubenswrapper[4929]: I1002 11:25:08.196794 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mcgd5" podStartSLOduration=2.456521061 podStartE2EDuration="5.19677823s" podCreationTimestamp="2025-10-02 11:25:03 +0000 UTC" firstStartedPulling="2025-10-02 11:25:05.152847594 +0000 UTC m=+905.703213958" lastFinishedPulling="2025-10-02 11:25:07.893104763 +0000 UTC m=+908.443471127" observedRunningTime="2025-10-02 11:25:08.194329318 +0000 UTC m=+908.744695702" watchObservedRunningTime="2025-10-02 11:25:08.19677823 +0000 UTC m=+908.747144594"
Oct 02 11:25:10 crc kubenswrapper[4929]: I1002 11:25:10.340043 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-77d94b9d4d-669p4"
Oct 02 11:25:10 crc kubenswrapper[4929]: I1002 11:25:10.992053 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"]
Oct 02 11:25:10 crc kubenswrapper[4929]: I1002 11:25:10.992766 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:10 crc kubenswrapper[4929]: I1002 11:25:10.994877 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Oct 02 11:25:10 crc kubenswrapper[4929]: I1002 11:25:10.995288 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mnbch"]
Oct 02 11:25:10 crc kubenswrapper[4929]: I1002 11:25:10.997702 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:10 crc kubenswrapper[4929]: I1002 11:25:10.998325 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-k9v4q"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:10.999976 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.000179 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.054894 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"]
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.080092 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2j7bg"]
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.080913 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.085130 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-94scc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.085295 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.085415 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.085532 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.098543 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-rbghc"]
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.099616 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.101333 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.111301 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rbghc"]
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193449 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsc9m\" (UniqueName: \"kubernetes.io/projected/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-kube-api-access-jsc9m\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193765 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shdjv\" (UniqueName: \"kubernetes.io/projected/31f871c6-7f36-48ee-b3d0-bc4913874419-kube-api-access-shdjv\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193783 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjq88\" (UniqueName: \"kubernetes.io/projected/81d9a392-bd1f-4577-a81f-df9fa22dbdec-kube-api-access-vjq88\") pod \"frr-k8s-webhook-server-64bf5d555-9l74k\" (UID: \"81d9a392-bd1f-4577-a81f-df9fa22dbdec\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193801 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/501c2d93-5384-46e1-a709-881d7e6eb442-metallb-excludel2\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193820 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-sockets\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193835 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31f871c6-7f36-48ee-b3d0-bc4913874419-cert\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193861 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-reloader\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193875 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31f871c6-7f36-48ee-b3d0-bc4913874419-metrics-certs\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193890 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-conf\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193907 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81d9a392-bd1f-4577-a81f-df9fa22dbdec-cert\") pod \"frr-k8s-webhook-server-64bf5d555-9l74k\" (UID: \"81d9a392-bd1f-4577-a81f-df9fa22dbdec\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193933 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193947 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-startup\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.193983 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grgr\" (UniqueName: \"kubernetes.io/projected/501c2d93-5384-46e1-a709-881d7e6eb442-kube-api-access-7grgr\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.194005 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-metrics-certs\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.194065 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-metrics\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.194085 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-metrics-certs\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.294751 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsc9m\" (UniqueName: \"kubernetes.io/projected/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-kube-api-access-jsc9m\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.294816 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shdjv\" (UniqueName: \"kubernetes.io/projected/31f871c6-7f36-48ee-b3d0-bc4913874419-kube-api-access-shdjv\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.294843 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjq88\" (UniqueName: \"kubernetes.io/projected/81d9a392-bd1f-4577-a81f-df9fa22dbdec-kube-api-access-vjq88\") pod \"frr-k8s-webhook-server-64bf5d555-9l74k\" (UID: \"81d9a392-bd1f-4577-a81f-df9fa22dbdec\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.294870 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/501c2d93-5384-46e1-a709-881d7e6eb442-metallb-excludel2\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.294895 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-sockets\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.294919 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31f871c6-7f36-48ee-b3d0-bc4913874419-cert\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.294974 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-reloader\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295000 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31f871c6-7f36-48ee-b3d0-bc4913874419-metrics-certs\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295024 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-conf\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295052 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81d9a392-bd1f-4577-a81f-df9fa22dbdec-cert\") pod \"frr-k8s-webhook-server-64bf5d555-9l74k\" (UID: \"81d9a392-bd1f-4577-a81f-df9fa22dbdec\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295102 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295125 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-startup\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295147 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7grgr\" (UniqueName: \"kubernetes.io/projected/501c2d93-5384-46e1-a709-881d7e6eb442-kube-api-access-7grgr\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295169 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-metrics-certs\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295201 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-metrics\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295233 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-metrics-certs\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: E1002 11:25:11.295305 4929 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 02 11:25:11 crc kubenswrapper[4929]: E1002 11:25:11.295370 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist podName:501c2d93-5384-46e1-a709-881d7e6eb442 nodeName:}" failed. No retries permitted until 2025-10-02 11:25:11.795348935 +0000 UTC m=+912.345715299 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist") pod "speaker-2j7bg" (UID: "501c2d93-5384-46e1-a709-881d7e6eb442") : secret "metallb-memberlist" not found
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295449 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-reloader\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: E1002 11:25:11.295539 4929 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Oct 02 11:25:11 crc kubenswrapper[4929]: E1002 11:25:11.295575 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-metrics-certs podName:501c2d93-5384-46e1-a709-881d7e6eb442 nodeName:}" failed. No retries permitted until 2025-10-02 11:25:11.795563761 +0000 UTC m=+912.345930125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-metrics-certs") pod "speaker-2j7bg" (UID: "501c2d93-5384-46e1-a709-881d7e6eb442") : secret "speaker-certs-secret" not found
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295828 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-metrics\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295850 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-conf\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.295860 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/501c2d93-5384-46e1-a709-881d7e6eb442-metallb-excludel2\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.296678 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-startup\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.297024 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-frr-sockets\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.297987 4929 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.310573 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-metrics-certs\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.310593 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/31f871c6-7f36-48ee-b3d0-bc4913874419-metrics-certs\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.310992 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31f871c6-7f36-48ee-b3d0-bc4913874419-cert\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.311357 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81d9a392-bd1f-4577-a81f-df9fa22dbdec-cert\") pod \"frr-k8s-webhook-server-64bf5d555-9l74k\" (UID: \"81d9a392-bd1f-4577-a81f-df9fa22dbdec\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.323615 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjq88\" (UniqueName: \"kubernetes.io/projected/81d9a392-bd1f-4577-a81f-df9fa22dbdec-kube-api-access-vjq88\") pod \"frr-k8s-webhook-server-64bf5d555-9l74k\" (UID: \"81d9a392-bd1f-4577-a81f-df9fa22dbdec\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.324832 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsc9m\" (UniqueName: \"kubernetes.io/projected/f8f57c23-45b1-47b0-bc42-33dc9b2f1f53-kube-api-access-jsc9m\") pod \"frr-k8s-mnbch\" (UID: \"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53\") " pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.325196 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grgr\" (UniqueName: \"kubernetes.io/projected/501c2d93-5384-46e1-a709-881d7e6eb442-kube-api-access-7grgr\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.325360 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shdjv\" (UniqueName: \"kubernetes.io/projected/31f871c6-7f36-48ee-b3d0-bc4913874419-kube-api-access-shdjv\") pod \"controller-68d546b9d8-rbghc\" (UID: \"31f871c6-7f36-48ee-b3d0-bc4913874419\") " pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.414725 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.608745 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.620256 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.779407 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"]
Oct 02 11:25:11 crc kubenswrapper[4929]: W1002 11:25:11.784465 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d9a392_bd1f_4577_a81f_df9fa22dbdec.slice/crio-850b4e3b109207b852462d50edd521173288657361a185a40afd58d1b1008ed5 WatchSource:0}: Error finding container 850b4e3b109207b852462d50edd521173288657361a185a40afd58d1b1008ed5: Status 404 returned error can't find the container with id 850b4e3b109207b852462d50edd521173288657361a185a40afd58d1b1008ed5
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.791402 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rbghc"]
Oct 02 11:25:11 crc kubenswrapper[4929]: W1002 11:25:11.799280 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f871c6_7f36_48ee_b3d0_bc4913874419.slice/crio-1b4210f492116bf72e1718b20dd5f61d9504b852d531e1d1c4abce10733422ca WatchSource:0}: Error finding container 1b4210f492116bf72e1718b20dd5f61d9504b852d531e1d1c4abce10733422ca: Status 404 returned error can't find the container with id 1b4210f492116bf72e1718b20dd5f61d9504b852d531e1d1c4abce10733422ca
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.799873 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.799914 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-metrics-certs\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:11 crc kubenswrapper[4929]: E1002 11:25:11.800037 4929 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 02 11:25:11 crc kubenswrapper[4929]: E1002 11:25:11.800120 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist podName:501c2d93-5384-46e1-a709-881d7e6eb442 nodeName:}" failed. No retries permitted until 2025-10-02 11:25:12.800100462 +0000 UTC m=+913.350466826 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist") pod "speaker-2j7bg" (UID: "501c2d93-5384-46e1-a709-881d7e6eb442") : secret "metallb-memberlist" not found
Oct 02 11:25:11 crc kubenswrapper[4929]: I1002 11:25:11.805465 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-metrics-certs\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:12 crc kubenswrapper[4929]: I1002 11:25:12.192145 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rbghc" event={"ID":"31f871c6-7f36-48ee-b3d0-bc4913874419","Type":"ContainerStarted","Data":"1b4210f492116bf72e1718b20dd5f61d9504b852d531e1d1c4abce10733422ca"}
Oct 02 11:25:12 crc kubenswrapper[4929]: I1002 11:25:12.193928 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k" event={"ID":"81d9a392-bd1f-4577-a81f-df9fa22dbdec","Type":"ContainerStarted","Data":"850b4e3b109207b852462d50edd521173288657361a185a40afd58d1b1008ed5"}
Oct 02 11:25:12 crc kubenswrapper[4929]: I1002 11:25:12.812665 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:12 crc kubenswrapper[4929]: I1002 11:25:12.818420 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/501c2d93-5384-46e1-a709-881d7e6eb442-memberlist\") pod \"speaker-2j7bg\" (UID: \"501c2d93-5384-46e1-a709-881d7e6eb442\") " pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:12 crc kubenswrapper[4929]: I1002 11:25:12.897081 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:12 crc kubenswrapper[4929]: W1002 11:25:12.925006 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod501c2d93_5384_46e1_a709_881d7e6eb442.slice/crio-7c183860ca7c56982792e3231ba30048c88e326b04fe079542402babd1c715b4 WatchSource:0}: Error finding container 7c183860ca7c56982792e3231ba30048c88e326b04fe079542402babd1c715b4: Status 404 returned error can't find the container with id 7c183860ca7c56982792e3231ba30048c88e326b04fe079542402babd1c715b4
Oct 02 11:25:13 crc kubenswrapper[4929]: I1002 11:25:13.200842 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2j7bg" event={"ID":"501c2d93-5384-46e1-a709-881d7e6eb442","Type":"ContainerStarted","Data":"7c183860ca7c56982792e3231ba30048c88e326b04fe079542402babd1c715b4"}
Oct 02 11:25:13 crc kubenswrapper[4929]: I1002 11:25:13.202819 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerStarted","Data":"60cb0dc949dcc5d1403aedf0a1a8f0dfbbb18ad874a36943ab60e526e258eb7f"}
Oct 02 11:25:13 crc kubenswrapper[4929]: I1002 11:25:13.205773 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rbghc" event={"ID":"31f871c6-7f36-48ee-b3d0-bc4913874419","Type":"ContainerStarted","Data":"f42800e2138a4f751aa5b3748677536f10f5c703fea8042b715b2a8063049fd5"}
Oct 02 11:25:13 crc kubenswrapper[4929]: I1002 11:25:13.617832 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:13 crc kubenswrapper[4929]: I1002 11:25:13.618153 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:13 crc kubenswrapper[4929]: I1002 11:25:13.689398 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.215413 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rbghc" event={"ID":"31f871c6-7f36-48ee-b3d0-bc4913874419","Type":"ContainerStarted","Data":"827e0e283d3bab2bce82b0f1915bb61336fcafd78b65e2cfd49a4496b4efd338"}
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.216409 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.222589 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2j7bg" event={"ID":"501c2d93-5384-46e1-a709-881d7e6eb442","Type":"ContainerStarted","Data":"bc2ece2449ebf1fd7c67012a85c5f6c57b8f28d627b88d4d40d0fe00ca1d8484"}
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.222619 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.222630 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2j7bg" event={"ID":"501c2d93-5384-46e1-a709-881d7e6eb442","Type":"ContainerStarted","Data":"6494d41374e04b2a367e5c554edfb56a6aed2ac19206306e19c8e00397a8b986"}
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.242474 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-rbghc" podStartSLOduration=3.242460576 podStartE2EDuration="3.242460576s" podCreationTimestamp="2025-10-02 11:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:25:14.241363595 +0000 UTC m=+914.791729979" watchObservedRunningTime="2025-10-02 11:25:14.242460576 +0000 UTC m=+914.792826940"
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.277728 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.296511 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2j7bg" podStartSLOduration=3.296490116 podStartE2EDuration="3.296490116s" podCreationTimestamp="2025-10-02 11:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:25:14.263494132 +0000 UTC m=+914.813860496" watchObservedRunningTime="2025-10-02 11:25:14.296490116 +0000 UTC m=+914.846856480"
Oct 02 11:25:14 crc kubenswrapper[4929]: I1002 11:25:14.322594 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcgd5"]
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.234717 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mcgd5" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="registry-server" containerID="cri-o://e6cec318692f10ff8aa96f2521d58df493897d327978602840b80c032f486561" gracePeriod=2
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.324874 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4625c"]
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.326742 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.338484 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4625c"]
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.462787 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-catalog-content\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.462896 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdwl4\" (UniqueName: \"kubernetes.io/projected/183c05c9-db31-4c65-9731-bb5529963bdd-kube-api-access-rdwl4\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.462950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-utilities\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.564573 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-utilities\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.564629 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-catalog-content\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.564700 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwl4\" (UniqueName: \"kubernetes.io/projected/183c05c9-db31-4c65-9731-bb5529963bdd-kube-api-access-rdwl4\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.565639 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-catalog-content\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.565721 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-utilities\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.597729 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwl4\" (UniqueName: \"kubernetes.io/projected/183c05c9-db31-4c65-9731-bb5529963bdd-kube-api-access-rdwl4\") pod \"community-operators-4625c\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:16 crc kubenswrapper[4929]: I1002 11:25:16.662269 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4625c"
Oct 02 11:25:17 crc kubenswrapper[4929]: I1002 11:25:17.212453 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4625c"]
Oct 02 11:25:17 crc kubenswrapper[4929]: I1002 11:25:17.259245 4929 generic.go:334] "Generic (PLEG): container finished" podID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerID="e6cec318692f10ff8aa96f2521d58df493897d327978602840b80c032f486561" exitCode=0
Oct 02 11:25:17 crc kubenswrapper[4929]: I1002 11:25:17.259276 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcgd5" event={"ID":"3228ff9a-1b88-458e-b668-2724a3c4f19e","Type":"ContainerDied","Data":"e6cec318692f10ff8aa96f2521d58df493897d327978602840b80c032f486561"}
Oct 02 11:25:17 crc kubenswrapper[4929]: I1002 11:25:17.263472 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4625c" event={"ID":"183c05c9-db31-4c65-9731-bb5529963bdd","Type":"ContainerStarted","Data":"74149e233268472e71be0da0681616755974cdd18c00d84c36650479347e1bd9"}
Oct 02 11:25:18 crc kubenswrapper[4929]: I1002 11:25:18.271410 4929 generic.go:334] "Generic (PLEG): container finished" podID="183c05c9-db31-4c65-9731-bb5529963bdd" containerID="598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158" exitCode=0
Oct 02 11:25:18 crc kubenswrapper[4929]: I1002 11:25:18.271724 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4625c" event={"ID":"183c05c9-db31-4c65-9731-bb5529963bdd","Type":"ContainerDied","Data":"598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158"}
Oct 02 11:25:19 crc kubenswrapper[4929]: I1002 11:25:19.873563 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.021831 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpcft\" (UniqueName: \"kubernetes.io/projected/3228ff9a-1b88-458e-b668-2724a3c4f19e-kube-api-access-mpcft\") pod \"3228ff9a-1b88-458e-b668-2724a3c4f19e\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") "
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.022317 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-catalog-content\") pod \"3228ff9a-1b88-458e-b668-2724a3c4f19e\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") "
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.022356 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-utilities\") pod \"3228ff9a-1b88-458e-b668-2724a3c4f19e\" (UID: \"3228ff9a-1b88-458e-b668-2724a3c4f19e\") "
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.023813 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-utilities" (OuterVolumeSpecName: "utilities") pod "3228ff9a-1b88-458e-b668-2724a3c4f19e" (UID: "3228ff9a-1b88-458e-b668-2724a3c4f19e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.036518 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3228ff9a-1b88-458e-b668-2724a3c4f19e-kube-api-access-mpcft" (OuterVolumeSpecName: "kube-api-access-mpcft") pod "3228ff9a-1b88-458e-b668-2724a3c4f19e" (UID: "3228ff9a-1b88-458e-b668-2724a3c4f19e"). InnerVolumeSpecName "kube-api-access-mpcft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.041625 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3228ff9a-1b88-458e-b668-2724a3c4f19e" (UID: "3228ff9a-1b88-458e-b668-2724a3c4f19e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.124117 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.124156 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3228ff9a-1b88-458e-b668-2724a3c4f19e-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.124180 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpcft\" (UniqueName: \"kubernetes.io/projected/3228ff9a-1b88-458e-b668-2724a3c4f19e-kube-api-access-mpcft\") on node \"crc\" DevicePath \"\""
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.282633 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mcgd5" event={"ID":"3228ff9a-1b88-458e-b668-2724a3c4f19e","Type":"ContainerDied","Data":"a1b76c9751f7f9293e4217a430cad6bf7b1b1d8f7bc518d219b3d8d85f7939a1"}
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.282678 4929 scope.go:117] "RemoveContainer" containerID="e6cec318692f10ff8aa96f2521d58df493897d327978602840b80c032f486561"
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.282730 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mcgd5"
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.300225 4929 scope.go:117] "RemoveContainer" containerID="3759552204ab3387fe9b54a585faa55a91a43b1fc82cb958d4d1fc079a89c7f2"
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.307708 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcgd5"]
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.314585 4929 scope.go:117] "RemoveContainer" containerID="4453aae05f774690727a9261095288bb2bea396ca7e64baa31d2e98fe005da27"
Oct 02 11:25:20 crc kubenswrapper[4929]: I1002 11:25:20.317066 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mcgd5"]
Oct 02 11:25:22 crc kubenswrapper[4929]: I1002 11:25:22.162806 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" path="/var/lib/kubelet/pods/3228ff9a-1b88-458e-b668-2724a3c4f19e/volumes"
Oct 02 11:25:25 crc kubenswrapper[4929]: I1002 11:25:25.318844 4929 generic.go:334] "Generic (PLEG): container finished" podID="f8f57c23-45b1-47b0-bc42-33dc9b2f1f53" containerID="219cbca2315d7299424ac4ff5f81560fb8af927062c3c6f7945be22d53c18649" exitCode=0
Oct 02 11:25:25 crc kubenswrapper[4929]: I1002 11:25:25.318924 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerDied","Data":"219cbca2315d7299424ac4ff5f81560fb8af927062c3c6f7945be22d53c18649"}
Oct 02 11:25:25 crc kubenswrapper[4929]: I1002 11:25:25.323900 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k" event={"ID":"81d9a392-bd1f-4577-a81f-df9fa22dbdec","Type":"ContainerStarted","Data":"f301c3ede1ba3d4692733123259e2c09ae2810f7d5dec3e3e6d81d78eeb0f2c8"}
Oct 02 11:25:25 crc kubenswrapper[4929]: I1002 11:25:25.323998 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k"
Oct 02 11:25:25 crc kubenswrapper[4929]: I1002 11:25:25.328016 4929 generic.go:334] "Generic (PLEG): container finished" podID="183c05c9-db31-4c65-9731-bb5529963bdd" containerID="15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847" exitCode=0
Oct 02 11:25:25 crc kubenswrapper[4929]: I1002 11:25:25.328083 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4625c" event={"ID":"183c05c9-db31-4c65-9731-bb5529963bdd","Type":"ContainerDied","Data":"15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847"}
Oct 02 11:25:25 crc kubenswrapper[4929]: I1002 11:25:25.371839 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k" podStartSLOduration=2.940004881 podStartE2EDuration="15.371822838s" podCreationTimestamp="2025-10-02 11:25:10 +0000 UTC" firstStartedPulling="2025-10-02 11:25:11.786526047 +0000 UTC m=+912.336892421" lastFinishedPulling="2025-10-02 11:25:24.218344014 +0000 UTC m=+924.768710378" observedRunningTime="2025-10-02 11:25:25.370720407 +0000 UTC m=+925.921086771" watchObservedRunningTime="2025-10-02 11:25:25.371822838 +0000 UTC m=+925.922189202"
Oct 02 11:25:26 crc kubenswrapper[4929]: I1002 11:25:26.338466 4929 generic.go:334] "Generic (PLEG): container finished" podID="f8f57c23-45b1-47b0-bc42-33dc9b2f1f53" containerID="1a86184e7e1f8b781fbefeba7b3f3e51e53d2494433807377c3d955a1b55d70f" exitCode=0
Oct 02 11:25:26 crc kubenswrapper[4929]: I1002 11:25:26.339070 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerDied","Data":"1a86184e7e1f8b781fbefeba7b3f3e51e53d2494433807377c3d955a1b55d70f"}
Oct 02 11:25:27 crc kubenswrapper[4929]: I1002 11:25:27.348822 4929 generic.go:334] "Generic (PLEG): container finished" podID="f8f57c23-45b1-47b0-bc42-33dc9b2f1f53" containerID="9324c1e00528cbd0cc10d4705505a64c2b0c451ccd2ba35957ed7c54b8710aa7" exitCode=0
Oct 02 11:25:27 crc kubenswrapper[4929]: I1002 11:25:27.348913 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerDied","Data":"9324c1e00528cbd0cc10d4705505a64c2b0c451ccd2ba35957ed7c54b8710aa7"}
Oct 02 11:25:27 crc kubenswrapper[4929]: I1002 11:25:27.351583 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4625c" event={"ID":"183c05c9-db31-4c65-9731-bb5529963bdd","Type":"ContainerStarted","Data":"ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f"}
Oct 02 11:25:27 crc kubenswrapper[4929]: I1002 11:25:27.408330 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4625c" podStartSLOduration=4.805359518 podStartE2EDuration="11.408312814s" podCreationTimestamp="2025-10-02 11:25:16 +0000 UTC" firstStartedPulling="2025-10-02 11:25:19.819869922 +0000 UTC m=+920.370236286" lastFinishedPulling="2025-10-02 11:25:26.422823218 +0000 UTC m=+926.973189582" observedRunningTime="2025-10-02 11:25:27.408280413 +0000 UTC m=+927.958646777" watchObservedRunningTime="2025-10-02 11:25:27.408312814 +0000 UTC m=+927.958679178"
Oct 02 11:25:28 crc kubenswrapper[4929]: I1002 11:25:28.361059 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerStarted","Data":"620d30a5d0bb9e911d7c607a2284f91a157dae1d0629ca288e8a9e82b985f410"}
Oct 02 11:25:28 crc kubenswrapper[4929]: I1002 11:25:28.361457 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerStarted","Data":"c908c77ee54ea2781fbb97a8dfc2468235c9a97bb11c7fde08b350387f7454f3"}
Oct 02 11:25:28 crc kubenswrapper[4929]: I1002 11:25:28.361472 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerStarted","Data":"f5ec140197a7a6edaf11ad58e2b125bc8b4094a7d00010bbe27d1242bb56dccc"}
Oct 02 11:25:29 crc kubenswrapper[4929]: I1002 11:25:29.375130 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerStarted","Data":"763e8c1324c05c20ec9c604e120231cf279dd4cb4367b36906be2bf1c8dadd85"}
Oct 02 11:25:29 crc kubenswrapper[4929]: I1002 11:25:29.375200 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerStarted","Data":"074f437c22bbc4b3d5dec80ee824785ad5104f203e8497b65962da871405b0d9"}
Oct 02 11:25:29 crc kubenswrapper[4929]: I1002 11:25:29.375218 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mnbch" event={"ID":"f8f57c23-45b1-47b0-bc42-33dc9b2f1f53","Type":"ContainerStarted","Data":"3905f7af59b75ddf91092418eeba019066ffcecc23f2906d1e55870d08bb3a7c"}
Oct 02 11:25:29 crc kubenswrapper[4929]: I1002 11:25:29.375484 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:29 crc kubenswrapper[4929]: I1002 11:25:29.409397 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mnbch" podStartSLOduration=7.806006408 podStartE2EDuration="19.409370677s" podCreationTimestamp="2025-10-02 11:25:10 +0000 UTC" firstStartedPulling="2025-10-02 11:25:12.614583014 +0000 UTC m=+913.164949378" lastFinishedPulling="2025-10-02 11:25:24.217947283 +0000 UTC m=+924.768313647" observedRunningTime="2025-10-02 11:25:29.401801652 +0000 UTC m=+929.952168046" watchObservedRunningTime="2025-10-02 11:25:29.409370677 +0000 UTC m=+929.959737061"
Oct 02 11:25:31 crc kubenswrapper[4929]: I1002 11:25:31.419449 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-rbghc"
Oct 02 11:25:31 crc kubenswrapper[4929]: I1002 11:25:31.620696 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:31 crc kubenswrapper[4929]: I1002 11:25:31.654785 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mnbch"
Oct 02 11:25:32 crc kubenswrapper[4929]: I1002 11:25:32.906668 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2j7bg"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.275823 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4"]
Oct 02 11:25:34 crc kubenswrapper[4929]: E1002 11:25:34.276074 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="extract-utilities"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.276088 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="extract-utilities"
Oct 02 11:25:34 crc kubenswrapper[4929]: E1002 11:25:34.276101 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="registry-server"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.276109 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="registry-server"
Oct 02 11:25:34 crc kubenswrapper[4929]: E1002 11:25:34.276119 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="extract-content"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.276125 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="extract-content"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.276261 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3228ff9a-1b88-458e-b668-2724a3c4f19e" containerName="registry-server"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.276992 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.278435 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.286673 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4"]
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.425758 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.425824 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.425855 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcgj9\" (UniqueName: \"kubernetes.io/projected/c650f07c-274b-4670-b136-d49448f2a3e4-kube-api-access-fcgj9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4"
Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.527123 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-bundle\") pod
\"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.527205 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.527245 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcgj9\" (UniqueName: \"kubernetes.io/projected/c650f07c-274b-4670-b136-d49448f2a3e4-kube-api-access-fcgj9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.527588 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.527737 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.546844 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcgj9\" (UniqueName: \"kubernetes.io/projected/c650f07c-274b-4670-b136-d49448f2a3e4-kube-api-access-fcgj9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.595594 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:34 crc kubenswrapper[4929]: I1002 11:25:34.823727 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4"] Oct 02 11:25:34 crc kubenswrapper[4929]: W1002 11:25:34.827344 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc650f07c_274b_4670_b136_d49448f2a3e4.slice/crio-ff5e3501c8708a32233e1d2adfeb1f2e6dd733df25d3c2f1ab9986d7c156cf48 WatchSource:0}: Error finding container ff5e3501c8708a32233e1d2adfeb1f2e6dd733df25d3c2f1ab9986d7c156cf48: Status 404 returned error can't find the container with id ff5e3501c8708a32233e1d2adfeb1f2e6dd733df25d3c2f1ab9986d7c156cf48 Oct 02 11:25:35 crc kubenswrapper[4929]: I1002 11:25:35.411623 4929 generic.go:334] "Generic (PLEG): container finished" podID="c650f07c-274b-4670-b136-d49448f2a3e4" containerID="6326eefe7c4e7d6b1af5c46bf34d541cab29a5eceff87f89c4acd5e907fc7445" exitCode=0 Oct 02 11:25:35 crc kubenswrapper[4929]: I1002 11:25:35.411675 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" event={"ID":"c650f07c-274b-4670-b136-d49448f2a3e4","Type":"ContainerDied","Data":"6326eefe7c4e7d6b1af5c46bf34d541cab29a5eceff87f89c4acd5e907fc7445"} Oct 02 11:25:35 crc kubenswrapper[4929]: I1002 11:25:35.411887 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" event={"ID":"c650f07c-274b-4670-b136-d49448f2a3e4","Type":"ContainerStarted","Data":"ff5e3501c8708a32233e1d2adfeb1f2e6dd733df25d3c2f1ab9986d7c156cf48"} Oct 02 11:25:36 crc kubenswrapper[4929]: I1002 11:25:36.662859 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4625c" Oct 02 11:25:36 crc kubenswrapper[4929]: I1002 11:25:36.662903 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4625c" Oct 02 11:25:36 crc kubenswrapper[4929]: I1002 11:25:36.707021 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4625c" Oct 02 11:25:37 crc kubenswrapper[4929]: I1002 11:25:37.457945 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4625c" Oct 02 11:25:39 crc kubenswrapper[4929]: I1002 11:25:39.633563 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4625c"] Oct 02 11:25:39 crc kubenswrapper[4929]: I1002 11:25:39.634357 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4625c" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="registry-server" containerID="cri-o://ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f" gracePeriod=2 Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.051443 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4625c" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.203098 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-utilities\") pod \"183c05c9-db31-4c65-9731-bb5529963bdd\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.203152 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdwl4\" (UniqueName: \"kubernetes.io/projected/183c05c9-db31-4c65-9731-bb5529963bdd-kube-api-access-rdwl4\") pod \"183c05c9-db31-4c65-9731-bb5529963bdd\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.203397 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-catalog-content\") pod \"183c05c9-db31-4c65-9731-bb5529963bdd\" (UID: \"183c05c9-db31-4c65-9731-bb5529963bdd\") " Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.203829 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-utilities" (OuterVolumeSpecName: "utilities") pod "183c05c9-db31-4c65-9731-bb5529963bdd" (UID: "183c05c9-db31-4c65-9731-bb5529963bdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.207538 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183c05c9-db31-4c65-9731-bb5529963bdd-kube-api-access-rdwl4" (OuterVolumeSpecName: "kube-api-access-rdwl4") pod "183c05c9-db31-4c65-9731-bb5529963bdd" (UID: "183c05c9-db31-4c65-9731-bb5529963bdd"). InnerVolumeSpecName "kube-api-access-rdwl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.254336 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "183c05c9-db31-4c65-9731-bb5529963bdd" (UID: "183c05c9-db31-4c65-9731-bb5529963bdd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.305241 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.305272 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdwl4\" (UniqueName: \"kubernetes.io/projected/183c05c9-db31-4c65-9731-bb5529963bdd-kube-api-access-rdwl4\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.305282 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183c05c9-db31-4c65-9731-bb5529963bdd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.444544 4929 generic.go:334] "Generic (PLEG): container finished" podID="c650f07c-274b-4670-b136-d49448f2a3e4" containerID="32be3763347d7a4987a1cdb4a469133b029dfdd51af43abff02a6f4680cfee9b" exitCode=0 Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.444590 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" event={"ID":"c650f07c-274b-4670-b136-d49448f2a3e4","Type":"ContainerDied","Data":"32be3763347d7a4987a1cdb4a469133b029dfdd51af43abff02a6f4680cfee9b"} Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.458352 4929 generic.go:334] "Generic (PLEG): container finished" podID="183c05c9-db31-4c65-9731-bb5529963bdd" containerID="ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f" exitCode=0 Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.458399 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4625c" event={"ID":"183c05c9-db31-4c65-9731-bb5529963bdd","Type":"ContainerDied","Data":"ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f"} Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.458427 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4625c" event={"ID":"183c05c9-db31-4c65-9731-bb5529963bdd","Type":"ContainerDied","Data":"74149e233268472e71be0da0681616755974cdd18c00d84c36650479347e1bd9"} Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.458753 4929 scope.go:117] "RemoveContainer" containerID="ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.458787 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4625c" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.479018 4929 scope.go:117] "RemoveContainer" containerID="15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.496276 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4625c"] Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.500224 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4625c"] Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.516083 4929 scope.go:117] "RemoveContainer" containerID="598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.531234 4929 scope.go:117] "RemoveContainer" containerID="ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f" Oct 02 11:25:40 crc kubenswrapper[4929]: E1002 11:25:40.531575 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f\": container with ID starting with ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f not found: ID does not exist" containerID="ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.531607 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f"} err="failed to get container status \"ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f\": rpc error: code = NotFound desc = could not find container \"ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f\": container with ID starting with ae5ff40ccf6e7815b70857371da0c6d8b93e505b80079cb123145b277cd3ce7f not found: ID does not exist" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.531628 4929 scope.go:117] "RemoveContainer" containerID="15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847" Oct 02 11:25:40 crc kubenswrapper[4929]: E1002 11:25:40.531881 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847\": container with ID starting with 15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847 not found: ID does not exist" containerID="15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.531921 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847"} err="failed to get container status \"15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847\": rpc error: code = NotFound desc = could not find container \"15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847\": container with ID starting with 15ebb33aaf2989224d341c2119351cea163adc35b7c9606dbdfc357a90317847 not found: ID does not exist" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.531966 4929 scope.go:117] "RemoveContainer" containerID="598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158" Oct 02 11:25:40 crc kubenswrapper[4929]: E1002 11:25:40.532241 4929 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158\": container with ID starting with 598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158 not found: ID does not exist" containerID="598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158" Oct 02 11:25:40 crc kubenswrapper[4929]: I1002 11:25:40.532261 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158"} err="failed to get container status \"598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158\": rpc error: code = NotFound desc = could not find container \"598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158\": container with ID starting with 598857edcc0840da3d18c41a17f1bf06ad30a9f1118accce3eb3055bc099d158 not found: ID does not exist" Oct 02 11:25:41 crc kubenswrapper[4929]: I1002 11:25:41.476643 4929 generic.go:334] "Generic (PLEG): container finished" podID="c650f07c-274b-4670-b136-d49448f2a3e4" containerID="519a88403d29116acf3a44bd6a8f915d6662983c7a87293ae2f83dae455945c6" exitCode=0 Oct 02 11:25:41 crc kubenswrapper[4929]: I1002 11:25:41.476864 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" event={"ID":"c650f07c-274b-4670-b136-d49448f2a3e4","Type":"ContainerDied","Data":"519a88403d29116acf3a44bd6a8f915d6662983c7a87293ae2f83dae455945c6"} Oct 02 11:25:41 crc kubenswrapper[4929]: I1002 11:25:41.615805 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-9l74k" Oct 02 11:25:41 crc kubenswrapper[4929]: I1002 11:25:41.623983 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mnbch" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.164814 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" path="/var/lib/kubelet/pods/183c05c9-db31-4c65-9731-bb5529963bdd/volumes" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.703896 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.736135 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcgj9\" (UniqueName: \"kubernetes.io/projected/c650f07c-274b-4670-b136-d49448f2a3e4-kube-api-access-fcgj9\") pod \"c650f07c-274b-4670-b136-d49448f2a3e4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.736203 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-util\") pod \"c650f07c-274b-4670-b136-d49448f2a3e4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.736235 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-bundle\") pod \"c650f07c-274b-4670-b136-d49448f2a3e4\" (UID: \"c650f07c-274b-4670-b136-d49448f2a3e4\") " Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.737421 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-bundle" (OuterVolumeSpecName: "bundle") pod "c650f07c-274b-4670-b136-d49448f2a3e4" (UID: "c650f07c-274b-4670-b136-d49448f2a3e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.741213 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c650f07c-274b-4670-b136-d49448f2a3e4-kube-api-access-fcgj9" (OuterVolumeSpecName: "kube-api-access-fcgj9") pod "c650f07c-274b-4670-b136-d49448f2a3e4" (UID: "c650f07c-274b-4670-b136-d49448f2a3e4"). InnerVolumeSpecName "kube-api-access-fcgj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.746066 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-util" (OuterVolumeSpecName: "util") pod "c650f07c-274b-4670-b136-d49448f2a3e4" (UID: "c650f07c-274b-4670-b136-d49448f2a3e4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.837472 4929 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.837499 4929 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c650f07c-274b-4670-b136-d49448f2a3e4-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:42 crc kubenswrapper[4929]: I1002 11:25:42.837508 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcgj9\" (UniqueName: \"kubernetes.io/projected/c650f07c-274b-4670-b136-d49448f2a3e4-kube-api-access-fcgj9\") on node \"crc\" DevicePath \"\"" Oct 02 11:25:43 crc kubenswrapper[4929]: I1002 11:25:43.494695 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" event={"ID":"c650f07c-274b-4670-b136-d49448f2a3e4","Type":"ContainerDied","Data":"ff5e3501c8708a32233e1d2adfeb1f2e6dd733df25d3c2f1ab9986d7c156cf48"} Oct 02 11:25:43 crc kubenswrapper[4929]: I1002 11:25:43.494748 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff5e3501c8708a32233e1d2adfeb1f2e6dd733df25d3c2f1ab9986d7c156cf48" Oct 02 11:25:43 crc kubenswrapper[4929]: I1002 11:25:43.494811 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.738909 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c"] Oct 02 11:25:47 crc kubenswrapper[4929]: E1002 11:25:47.739413 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="registry-server" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739426 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="registry-server" Oct 02 11:25:47 crc kubenswrapper[4929]: E1002 11:25:47.739435 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="extract-content" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739441 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="extract-content" Oct 02 11:25:47 crc kubenswrapper[4929]: E1002 11:25:47.739448 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c650f07c-274b-4670-b136-d49448f2a3e4" containerName="util" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739455 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c650f07c-274b-4670-b136-d49448f2a3e4" containerName="util" Oct 02 11:25:47 crc kubenswrapper[4929]: E1002 11:25:47.739465 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="extract-utilities" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739471 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="extract-utilities" Oct 02 11:25:47 crc kubenswrapper[4929]: E1002 11:25:47.739488 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c650f07c-274b-4670-b136-d49448f2a3e4" containerName="pull" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739495 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c650f07c-274b-4670-b136-d49448f2a3e4" containerName="pull" Oct 02 11:25:47 crc kubenswrapper[4929]: E1002 11:25:47.739505 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c650f07c-274b-4670-b136-d49448f2a3e4" containerName="extract" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739511 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c650f07c-274b-4670-b136-d49448f2a3e4" containerName="extract" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739608 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c650f07c-274b-4670-b136-d49448f2a3e4" containerName="extract" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.739620 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="183c05c9-db31-4c65-9731-bb5529963bdd" containerName="registry-server" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.740016 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.742651 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.742802 4929 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-tp4p9" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.742919 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.808943 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c"] Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.811553 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zmq\" (UniqueName: \"kubernetes.io/projected/28e20376-66b4-4ad9-8af1-b9eba76cdcff-kube-api-access-t4zmq\") pod \"cert-manager-operator-controller-manager-57cd46d6d-b8n9c\" (UID: \"28e20376-66b4-4ad9-8af1-b9eba76cdcff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.912818 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zmq\" (UniqueName: \"kubernetes.io/projected/28e20376-66b4-4ad9-8af1-b9eba76cdcff-kube-api-access-t4zmq\") pod \"cert-manager-operator-controller-manager-57cd46d6d-b8n9c\" (UID: \"28e20376-66b4-4ad9-8af1-b9eba76cdcff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" Oct 02 11:25:47 crc kubenswrapper[4929]: I1002 11:25:47.941537 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zmq\" (UniqueName: \"kubernetes.io/projected/28e20376-66b4-4ad9-8af1-b9eba76cdcff-kube-api-access-t4zmq\") pod \"cert-manager-operator-controller-manager-57cd46d6d-b8n9c\" (UID: \"28e20376-66b4-4ad9-8af1-b9eba76cdcff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" Oct 02 11:25:48 crc kubenswrapper[4929]: I1002 11:25:48.053353 4929 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" Oct 02 11:25:48 crc kubenswrapper[4929]: I1002 11:25:48.246498 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c"] Oct 02 11:25:48 crc kubenswrapper[4929]: W1002 11:25:48.250277 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e20376_66b4_4ad9_8af1_b9eba76cdcff.slice/crio-66085bbaec12ff18d4011e318fe314148b3ea0ba833d1698bdcf423355344700 WatchSource:0}: Error finding container 66085bbaec12ff18d4011e318fe314148b3ea0ba833d1698bdcf423355344700: Status 404 returned error can't find the container with id 66085bbaec12ff18d4011e318fe314148b3ea0ba833d1698bdcf423355344700 Oct 02 11:25:48 crc kubenswrapper[4929]: I1002 11:25:48.520633 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" event={"ID":"28e20376-66b4-4ad9-8af1-b9eba76cdcff","Type":"ContainerStarted","Data":"66085bbaec12ff18d4011e318fe314148b3ea0ba833d1698bdcf423355344700"} Oct 02 11:25:55 crc kubenswrapper[4929]: I1002 11:25:55.607060 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" event={"ID":"28e20376-66b4-4ad9-8af1-b9eba76cdcff","Type":"ContainerStarted","Data":"e98dfdc0cb3ce490d0fad61ba02b979262d7b0267fca6be7259c837b4a670446"} Oct 02 11:25:55 crc kubenswrapper[4929]: I1002 11:25:55.632059 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-b8n9c" podStartSLOduration=1.801522795 podStartE2EDuration="8.632041637s" podCreationTimestamp="2025-10-02 11:25:47 +0000 UTC" firstStartedPulling="2025-10-02 11:25:48.253322126 +0000 UTC m=+948.803688490" lastFinishedPulling="2025-10-02 11:25:55.083840968 +0000 UTC m=+955.634207332" observedRunningTime="2025-10-02 11:25:55.627634162 +0000 UTC m=+956.178000536" watchObservedRunningTime="2025-10-02 11:25:55.632041637 +0000 UTC m=+956.182408001" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.523668 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-2746m"] Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.524860 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.528607 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.529328 4929 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-82kb5" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.530706 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.535097 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-2746m"] Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.559645 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzqb9\" (UniqueName: \"kubernetes.io/projected/ca8b679f-63d7-48c7-bd1b-a37148b857f7-kube-api-access-wzqb9\") pod \"cert-manager-webhook-d969966f-2746m\" (UID: \"ca8b679f-63d7-48c7-bd1b-a37148b857f7\") " pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.559980 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca8b679f-63d7-48c7-bd1b-a37148b857f7-bound-sa-token\") pod \"cert-manager-webhook-d969966f-2746m\" (UID: \"ca8b679f-63d7-48c7-bd1b-a37148b857f7\") " pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.661287 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca8b679f-63d7-48c7-bd1b-a37148b857f7-bound-sa-token\") pod \"cert-manager-webhook-d969966f-2746m\" (UID: \"ca8b679f-63d7-48c7-bd1b-a37148b857f7\") " pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.661481 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzqb9\" (UniqueName: \"kubernetes.io/projected/ca8b679f-63d7-48c7-bd1b-a37148b857f7-kube-api-access-wzqb9\") pod \"cert-manager-webhook-d969966f-2746m\" (UID: \"ca8b679f-63d7-48c7-bd1b-a37148b857f7\") " pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.680401 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzqb9\" (UniqueName: \"kubernetes.io/projected/ca8b679f-63d7-48c7-bd1b-a37148b857f7-kube-api-access-wzqb9\") pod \"cert-manager-webhook-d969966f-2746m\" (UID: \"ca8b679f-63d7-48c7-bd1b-a37148b857f7\") " pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.682999 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca8b679f-63d7-48c7-bd1b-a37148b857f7-bound-sa-token\") pod \"cert-manager-webhook-d969966f-2746m\" (UID: \"ca8b679f-63d7-48c7-bd1b-a37148b857f7\") " pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:57 crc kubenswrapper[4929]: I1002 11:25:57.844039 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:25:58 crc kubenswrapper[4929]: I1002 11:25:58.294117 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-2746m"] Oct 02 11:25:58 crc kubenswrapper[4929]: W1002 11:25:58.302655 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca8b679f_63d7_48c7_bd1b_a37148b857f7.slice/crio-ed496be7cf8f89581c0882673aa030e76dfadeaa37ca3d8593c2e17bc4d31f65 WatchSource:0}: Error finding container ed496be7cf8f89581c0882673aa030e76dfadeaa37ca3d8593c2e17bc4d31f65: Status 404 returned error can't find the container with id ed496be7cf8f89581c0882673aa030e76dfadeaa37ca3d8593c2e17bc4d31f65 Oct 02 11:25:58 crc kubenswrapper[4929]: I1002 11:25:58.624859 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-2746m" event={"ID":"ca8b679f-63d7-48c7-bd1b-a37148b857f7","Type":"ContainerStarted","Data":"ed496be7cf8f89581c0882673aa030e76dfadeaa37ca3d8593c2e17bc4d31f65"} Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.099323 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l"] Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.101334 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.107433 4929 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-l9bhb" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.111446 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l"] Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.206328 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74bc\" (UniqueName: \"kubernetes.io/projected/09157fd6-e304-4dcc-82c3-724f2ff5d23c-kube-api-access-w74bc\") pod \"cert-manager-cainjector-7d9f95dbf-8xb2l\" (UID: \"09157fd6-e304-4dcc-82c3-724f2ff5d23c\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.206379 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09157fd6-e304-4dcc-82c3-724f2ff5d23c-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-8xb2l\" (UID: \"09157fd6-e304-4dcc-82c3-724f2ff5d23c\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.307641 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74bc\" (UniqueName: \"kubernetes.io/projected/09157fd6-e304-4dcc-82c3-724f2ff5d23c-kube-api-access-w74bc\") pod \"cert-manager-cainjector-7d9f95dbf-8xb2l\" (UID: \"09157fd6-e304-4dcc-82c3-724f2ff5d23c\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.307724 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09157fd6-e304-4dcc-82c3-724f2ff5d23c-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-8xb2l\" (UID: \"09157fd6-e304-4dcc-82c3-724f2ff5d23c\") " 
pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.345292 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74bc\" (UniqueName: \"kubernetes.io/projected/09157fd6-e304-4dcc-82c3-724f2ff5d23c-kube-api-access-w74bc\") pod \"cert-manager-cainjector-7d9f95dbf-8xb2l\" (UID: \"09157fd6-e304-4dcc-82c3-724f2ff5d23c\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.345599 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09157fd6-e304-4dcc-82c3-724f2ff5d23c-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-8xb2l\" (UID: \"09157fd6-e304-4dcc-82c3-724f2ff5d23c\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.425049 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" Oct 02 11:26:00 crc kubenswrapper[4929]: I1002 11:26:00.659298 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l"] Oct 02 11:26:02 crc kubenswrapper[4929]: W1002 11:26:02.150344 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09157fd6_e304_4dcc_82c3_724f2ff5d23c.slice/crio-cb994029df71d4a18cb8bc89f395acc594fd4dce0522178d6c7542d148920ba2 WatchSource:0}: Error finding container cb994029df71d4a18cb8bc89f395acc594fd4dce0522178d6c7542d148920ba2: Status 404 returned error can't find the container with id cb994029df71d4a18cb8bc89f395acc594fd4dce0522178d6c7542d148920ba2 Oct 02 11:26:02 crc kubenswrapper[4929]: I1002 11:26:02.659739 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-2746m" event={"ID":"ca8b679f-63d7-48c7-bd1b-a37148b857f7","Type":"ContainerStarted","Data":"d1047f1a97d050306e3f2d2224ab487c5eed3879ded49eb26b0c062c43eb2a95"} Oct 02 11:26:02 crc kubenswrapper[4929]: I1002 11:26:02.659977 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:26:02 crc kubenswrapper[4929]: I1002 11:26:02.664146 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" event={"ID":"09157fd6-e304-4dcc-82c3-724f2ff5d23c","Type":"ContainerStarted","Data":"cb994029df71d4a18cb8bc89f395acc594fd4dce0522178d6c7542d148920ba2"} Oct 02 11:26:02 crc kubenswrapper[4929]: I1002 11:26:02.678298 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-2746m" podStartSLOduration=1.743764471 podStartE2EDuration="5.678168318s" podCreationTimestamp="2025-10-02 11:25:57 +0000 UTC" firstStartedPulling="2025-10-02 11:25:58.306015861 +0000 UTC m=+958.856382225" lastFinishedPulling="2025-10-02 11:26:02.240419708 +0000 UTC m=+962.790786072" observedRunningTime="2025-10-02 11:26:02.677819298 +0000 UTC m=+963.228185662" watchObservedRunningTime="2025-10-02 11:26:02.678168318 +0000 UTC m=+963.228534682" Oct 02 11:26:03 crc kubenswrapper[4929]: I1002 11:26:03.670989 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" 
event={"ID":"09157fd6-e304-4dcc-82c3-724f2ff5d23c","Type":"ContainerStarted","Data":"89da9e1e677e74bbb4911178fe0b87eaaf10e064f873741dc0db457deb7cc2b4"} Oct 02 11:26:03 crc kubenswrapper[4929]: I1002 11:26:03.685577 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-8xb2l" podStartSLOduration=2.94511701 podStartE2EDuration="3.685557264s" podCreationTimestamp="2025-10-02 11:26:00 +0000 UTC" firstStartedPulling="2025-10-02 11:26:02.155115552 +0000 UTC m=+962.705481916" lastFinishedPulling="2025-10-02 11:26:02.895555816 +0000 UTC m=+963.445922170" observedRunningTime="2025-10-02 11:26:03.685133852 +0000 UTC m=+964.235500226" watchObservedRunningTime="2025-10-02 11:26:03.685557264 +0000 UTC m=+964.235923648" Oct 02 11:26:07 crc kubenswrapper[4929]: I1002 11:26:07.847373 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-2746m" Oct 02 11:26:16 crc kubenswrapper[4929]: I1002 11:26:16.814588 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5cwl"] Oct 02 11:26:16 crc kubenswrapper[4929]: I1002 11:26:16.815850 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:16 crc kubenswrapper[4929]: I1002 11:26:16.818709 4929 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zcz7h" Oct 02 11:26:16 crc kubenswrapper[4929]: I1002 11:26:16.830091 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5cwl"] Oct 02 11:26:16 crc kubenswrapper[4929]: I1002 11:26:16.952669 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/392aa533-50c6-4fa9-993d-22ff65870e4a-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5cwl\" (UID: \"392aa533-50c6-4fa9-993d-22ff65870e4a\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:16 crc kubenswrapper[4929]: I1002 11:26:16.952763 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8d7w\" (UniqueName: \"kubernetes.io/projected/392aa533-50c6-4fa9-993d-22ff65870e4a-kube-api-access-g8d7w\") pod \"cert-manager-7d4cc89fcb-g5cwl\" (UID: \"392aa533-50c6-4fa9-993d-22ff65870e4a\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.054704 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/392aa533-50c6-4fa9-993d-22ff65870e4a-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5cwl\" (UID: \"392aa533-50c6-4fa9-993d-22ff65870e4a\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.054895 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d7w\" (UniqueName: \"kubernetes.io/projected/392aa533-50c6-4fa9-993d-22ff65870e4a-kube-api-access-g8d7w\") pod \"cert-manager-7d4cc89fcb-g5cwl\" (UID: \"392aa533-50c6-4fa9-993d-22ff65870e4a\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.079537 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8d7w\" (UniqueName: 
\"kubernetes.io/projected/392aa533-50c6-4fa9-993d-22ff65870e4a-kube-api-access-g8d7w\") pod \"cert-manager-7d4cc89fcb-g5cwl\" (UID: \"392aa533-50c6-4fa9-993d-22ff65870e4a\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.082781 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/392aa533-50c6-4fa9-993d-22ff65870e4a-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5cwl\" (UID: \"392aa533-50c6-4fa9-993d-22ff65870e4a\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.133108 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.559862 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5cwl"] Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.744154 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" event={"ID":"392aa533-50c6-4fa9-993d-22ff65870e4a","Type":"ContainerStarted","Data":"b7ef2f4e5f6f98d7148a0de03ad301613c73e049d3d947440b633280a70ced0b"} Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.744200 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" event={"ID":"392aa533-50c6-4fa9-993d-22ff65870e4a","Type":"ContainerStarted","Data":"4b961ae023eb4f4c4c2c49c91ffd0f13f4ca44a48e5a15734661df5857da0f63"} Oct 02 11:26:17 crc kubenswrapper[4929]: I1002 11:26:17.759592 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-g5cwl" podStartSLOduration=1.759574378 podStartE2EDuration="1.759574378s" podCreationTimestamp="2025-10-02 11:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:26:17.758633422 +0000 UTC m=+978.308999786" watchObservedRunningTime="2025-10-02 11:26:17.759574378 +0000 UTC m=+978.309940742" Oct 02 11:26:20 crc kubenswrapper[4929]: I1002 11:26:20.935640 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hhk94"] Oct 02 11:26:20 crc kubenswrapper[4929]: I1002 11:26:20.936658 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hhk94" Oct 02 11:26:20 crc kubenswrapper[4929]: I1002 11:26:20.939642 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xbx2b" Oct 02 11:26:20 crc kubenswrapper[4929]: I1002 11:26:20.939661 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 11:26:20 crc kubenswrapper[4929]: I1002 11:26:20.939928 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 11:26:20 crc kubenswrapper[4929]: I1002 11:26:20.947812 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hhk94"] Oct 02 11:26:21 crc kubenswrapper[4929]: I1002 11:26:21.012765 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khp9q\" (UniqueName: \"kubernetes.io/projected/9933fce8-c766-475d-91c9-2e498520c360-kube-api-access-khp9q\") pod \"openstack-operator-index-hhk94\" (UID: \"9933fce8-c766-475d-91c9-2e498520c360\") " pod="openstack-operators/openstack-operator-index-hhk94" Oct 02 11:26:21 crc kubenswrapper[4929]: I1002 11:26:21.115225 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khp9q\" (UniqueName: \"kubernetes.io/projected/9933fce8-c766-475d-91c9-2e498520c360-kube-api-access-khp9q\") pod \"openstack-operator-index-hhk94\" (UID: \"9933fce8-c766-475d-91c9-2e498520c360\") " pod="openstack-operators/openstack-operator-index-hhk94" Oct 02 11:26:21 crc kubenswrapper[4929]: I1002 11:26:21.134859 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khp9q\" (UniqueName: \"kubernetes.io/projected/9933fce8-c766-475d-91c9-2e498520c360-kube-api-access-khp9q\") pod \"openstack-operator-index-hhk94\" (UID: \"9933fce8-c766-475d-91c9-2e498520c360\") " pod="openstack-operators/openstack-operator-index-hhk94" Oct 02 11:26:21 crc kubenswrapper[4929]: I1002 11:26:21.298546 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hhk94" Oct 02 11:26:21 crc kubenswrapper[4929]: I1002 11:26:21.685469 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hhk94"] Oct 02 11:26:21 crc kubenswrapper[4929]: I1002 11:26:21.770416 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hhk94" event={"ID":"9933fce8-c766-475d-91c9-2e498520c360","Type":"ContainerStarted","Data":"1ab8e7b70bdfbd34885e1ffbbf8613ad677159f63d6196d270fc76f6fb01eb1d"} Oct 02 11:26:24 crc kubenswrapper[4929]: I1002 11:26:24.118034 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hhk94"] Oct 02 11:26:24 crc kubenswrapper[4929]: I1002 11:26:24.737040 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dd9vd"] Oct 02 11:26:24 crc kubenswrapper[4929]: I1002 11:26:24.739162 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:24 crc kubenswrapper[4929]: I1002 11:26:24.751308 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dd9vd"] Oct 02 11:26:24 crc kubenswrapper[4929]: I1002 11:26:24.874222 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58jh5\" (UniqueName: \"kubernetes.io/projected/2298fe8f-5cad-4612-b33b-6941a43c5a63-kube-api-access-58jh5\") pod \"openstack-operator-index-dd9vd\" (UID: \"2298fe8f-5cad-4612-b33b-6941a43c5a63\") " pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:24 crc kubenswrapper[4929]: I1002 11:26:24.975950 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58jh5\" (UniqueName: \"kubernetes.io/projected/2298fe8f-5cad-4612-b33b-6941a43c5a63-kube-api-access-58jh5\") pod \"openstack-operator-index-dd9vd\" (UID: \"2298fe8f-5cad-4612-b33b-6941a43c5a63\") " pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:24 crc kubenswrapper[4929]: I1002 11:26:24.994860 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58jh5\" (UniqueName: \"kubernetes.io/projected/2298fe8f-5cad-4612-b33b-6941a43c5a63-kube-api-access-58jh5\") pod \"openstack-operator-index-dd9vd\" (UID: \"2298fe8f-5cad-4612-b33b-6941a43c5a63\") " pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:25 crc kubenswrapper[4929]: I1002 11:26:25.063607 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:31 crc kubenswrapper[4929]: I1002 11:26:31.761269 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dd9vd"] Oct 02 11:26:32 crc kubenswrapper[4929]: W1002 11:26:32.166770 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2298fe8f_5cad_4612_b33b_6941a43c5a63.slice/crio-8447fe85c40f34185d64c5db15b8a2e012dffc4876ef96d2f5ed624df6bfeb2f WatchSource:0}: Error finding container 8447fe85c40f34185d64c5db15b8a2e012dffc4876ef96d2f5ed624df6bfeb2f: Status 404 returned error can't find the container with id 8447fe85c40f34185d64c5db15b8a2e012dffc4876ef96d2f5ed624df6bfeb2f Oct 02 11:26:32 crc kubenswrapper[4929]: I1002 11:26:32.866242 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hhk94" event={"ID":"9933fce8-c766-475d-91c9-2e498520c360","Type":"ContainerStarted","Data":"bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6"} Oct 02 11:26:32 crc kubenswrapper[4929]: I1002 11:26:32.866359 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hhk94" podUID="9933fce8-c766-475d-91c9-2e498520c360" containerName="registry-server" containerID="cri-o://bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6" gracePeriod=2 Oct 02 11:26:32 crc kubenswrapper[4929]: I1002 11:26:32.870551 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dd9vd" event={"ID":"2298fe8f-5cad-4612-b33b-6941a43c5a63","Type":"ContainerStarted","Data":"742d4f4d49ebac55427e1d6e0f432ccb5eeb56586254ecde9746856a4ed51ec8"} Oct 02 11:26:32 crc kubenswrapper[4929]: I1002 11:26:32.870606 4929 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dd9vd" event={"ID":"2298fe8f-5cad-4612-b33b-6941a43c5a63","Type":"ContainerStarted","Data":"8447fe85c40f34185d64c5db15b8a2e012dffc4876ef96d2f5ed624df6bfeb2f"} Oct 02 11:26:32 crc kubenswrapper[4929]: I1002 11:26:32.884030 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hhk94" podStartSLOduration=2.377590709 podStartE2EDuration="12.883949044s" podCreationTimestamp="2025-10-02 11:26:20 +0000 UTC" firstStartedPulling="2025-10-02 11:26:21.693128211 +0000 UTC m=+982.243494575" lastFinishedPulling="2025-10-02 11:26:32.199486556 +0000 UTC m=+992.749852910" observedRunningTime="2025-10-02 11:26:32.879864329 +0000 UTC m=+993.430230693" watchObservedRunningTime="2025-10-02 11:26:32.883949044 +0000 UTC m=+993.434315408" Oct 02 11:26:32 crc kubenswrapper[4929]: I1002 11:26:32.895906 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dd9vd" podStartSLOduration=8.835107861000001 podStartE2EDuration="8.895890743s" podCreationTimestamp="2025-10-02 11:26:24 +0000 UTC" firstStartedPulling="2025-10-02 11:26:32.187411573 +0000 UTC m=+992.737777937" lastFinishedPulling="2025-10-02 11:26:32.248194455 +0000 UTC m=+992.798560819" observedRunningTime="2025-10-02 11:26:32.893809504 +0000 UTC m=+993.444175868" watchObservedRunningTime="2025-10-02 11:26:32.895890743 +0000 UTC m=+993.446257107" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.194930 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hhk94" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.366321 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khp9q\" (UniqueName: \"kubernetes.io/projected/9933fce8-c766-475d-91c9-2e498520c360-kube-api-access-khp9q\") pod \"9933fce8-c766-475d-91c9-2e498520c360\" (UID: \"9933fce8-c766-475d-91c9-2e498520c360\") " Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.374149 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9933fce8-c766-475d-91c9-2e498520c360-kube-api-access-khp9q" (OuterVolumeSpecName: "kube-api-access-khp9q") pod "9933fce8-c766-475d-91c9-2e498520c360" (UID: "9933fce8-c766-475d-91c9-2e498520c360"). InnerVolumeSpecName "kube-api-access-khp9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.467665 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khp9q\" (UniqueName: \"kubernetes.io/projected/9933fce8-c766-475d-91c9-2e498520c360-kube-api-access-khp9q\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.880514 4929 generic.go:334] "Generic (PLEG): container finished" podID="9933fce8-c766-475d-91c9-2e498520c360" containerID="bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6" exitCode=0 Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.880571 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hhk94" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.880625 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hhk94" event={"ID":"9933fce8-c766-475d-91c9-2e498520c360","Type":"ContainerDied","Data":"bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6"} Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.880684 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hhk94" event={"ID":"9933fce8-c766-475d-91c9-2e498520c360","Type":"ContainerDied","Data":"1ab8e7b70bdfbd34885e1ffbbf8613ad677159f63d6196d270fc76f6fb01eb1d"} Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.880709 4929 scope.go:117] "RemoveContainer" containerID="bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.910542 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hhk94"] Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.912260 4929 scope.go:117] "RemoveContainer" containerID="bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6" Oct 02 11:26:33 crc kubenswrapper[4929]: E1002 11:26:33.912696 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6\": container with ID starting with bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6 not found: ID does not exist" containerID="bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.912739 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6"} err="failed to get container status \"bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6\": rpc error: code = NotFound desc = could not find container \"bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6\": container with ID starting with bdae5fa4136d351ab837a9ba742602aaf278e4a7dd79dccd7a413ffe1ca1b1c6 not found: ID does not exist" Oct 02 11:26:33 crc kubenswrapper[4929]: I1002 11:26:33.916922 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hhk94"] Oct 02 11:26:34 crc kubenswrapper[4929]: I1002 11:26:34.169639 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9933fce8-c766-475d-91c9-2e498520c360" path="/var/lib/kubelet/pods/9933fce8-c766-475d-91c9-2e498520c360/volumes" Oct 02 11:26:35 crc kubenswrapper[4929]: I1002 11:26:35.064329 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:35 crc kubenswrapper[4929]: I1002 11:26:35.064378 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:35 crc kubenswrapper[4929]: I1002 11:26:35.096225 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:44 crc kubenswrapper[4929]: I1002 11:26:44.736730 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:26:44 crc kubenswrapper[4929]: I1002 11:26:44.738120 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:26:45 crc kubenswrapper[4929]: I1002 11:26:45.091483 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-dd9vd" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.728634 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms"] Oct 02 11:26:51 crc kubenswrapper[4929]: E1002 11:26:51.729681 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9933fce8-c766-475d-91c9-2e498520c360" containerName="registry-server" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.729697 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9933fce8-c766-475d-91c9-2e498520c360" containerName="registry-server" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.729850 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9933fce8-c766-475d-91c9-2e498520c360" containerName="registry-server" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.730900 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.733090 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4tb5d" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.733573 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms"] Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.824014 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fqw\" (UniqueName: \"kubernetes.io/projected/a7303a1a-93cc-40f2-b087-f5b6723797f8-kube-api-access-s5fqw\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.824098 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-util\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.824244 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-bundle\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " 
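The error pair at 11:26:33.912 above ("ContainerStatus from runtime service failed" / "DeleteContainer returned error ... NotFound") is the kubelet retrying a removal that had already completed: the first "RemoveContainer" and the API DELETE won the race, so the second attempt finds nothing in CRI-O. Since the desired state (container gone) already holds, the pod is cleanly REMOVEd a few milliseconds later. A small sketch to separate such benign NotFound deletions from genuine failures, assuming a plain-text journal dump on stdin:

    #!/usr/bin/env python3
    # Flag "DeleteContainer returned error ... NotFound" entries and check whether
    # the same container ID already appeared in a "RemoveContainer" entry, which
    # marks the error as a benign cleanup race rather than a real failure.
    import re
    import sys

    removed = set()   # container IDs seen in "RemoveContainer" entries
    for line in sys.stdin:
        m = re.search(r'"RemoveContainer" containerID="([0-9a-f]+)"', line)
        if m:
            removed.add(m.group(1))
            continue
        if "DeleteContainer returned error" in line and "NotFound" in line:
            m = re.search(r'"ID":"([0-9a-f]+)"', line)
            cid = m.group(1) if m else "?"
            verdict = "benign (already removed)" if cid in removed else "investigate"
            print(f"{cid[:12]}: {verdict}")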
pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.925617 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-util\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.925717 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-bundle\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.925783 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fqw\" (UniqueName: \"kubernetes.io/projected/a7303a1a-93cc-40f2-b087-f5b6723797f8-kube-api-access-s5fqw\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.926235 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-util\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.926459 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-bundle\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:51 crc kubenswrapper[4929]: I1002 11:26:51.952316 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fqw\" (UniqueName: \"kubernetes.io/projected/a7303a1a-93cc-40f2-b087-f5b6723797f8-kube-api-access-s5fqw\") pod \"fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:52 crc kubenswrapper[4929]: I1002 11:26:52.057506 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:52 crc kubenswrapper[4929]: I1002 11:26:52.457134 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms"] Oct 02 11:26:52 crc kubenswrapper[4929]: I1002 11:26:52.993371 4929 generic.go:334] "Generic (PLEG): container finished" podID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerID="6de64f0bd31d67ae354228f64806c9b9ec7f595b16639dfa0740e30b5ae1372e" exitCode=0 Oct 02 11:26:52 crc kubenswrapper[4929]: I1002 11:26:52.993484 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" event={"ID":"a7303a1a-93cc-40f2-b087-f5b6723797f8","Type":"ContainerDied","Data":"6de64f0bd31d67ae354228f64806c9b9ec7f595b16639dfa0740e30b5ae1372e"} Oct 02 11:26:52 crc kubenswrapper[4929]: I1002 11:26:52.993616 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" event={"ID":"a7303a1a-93cc-40f2-b087-f5b6723797f8","Type":"ContainerStarted","Data":"bf034a210d846d4c23c0c988f07ce3fd1c98b95b5fee213c7b10700fa5a0e6c4"} Oct 02 11:26:54 crc kubenswrapper[4929]: I1002 11:26:54.004010 4929 generic.go:334] "Generic (PLEG): container finished" podID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerID="96ffcf1d6a2f609ec42fba0bbefdb29cd71f667e7a441d8353fefac26a88a3b0" exitCode=0 Oct 02 11:26:54 crc kubenswrapper[4929]: I1002 11:26:54.004092 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" event={"ID":"a7303a1a-93cc-40f2-b087-f5b6723797f8","Type":"ContainerDied","Data":"96ffcf1d6a2f609ec42fba0bbefdb29cd71f667e7a441d8353fefac26a88a3b0"} Oct 02 11:26:55 crc kubenswrapper[4929]: I1002 11:26:55.012943 4929 generic.go:334] "Generic (PLEG): container finished" podID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerID="5b33437895c07689d2aeeccfef89b00865ff84fce5387f9982893abc9630977b" exitCode=0 Oct 02 11:26:55 crc kubenswrapper[4929]: I1002 11:26:55.013020 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" event={"ID":"a7303a1a-93cc-40f2-b087-f5b6723797f8","Type":"ContainerDied","Data":"5b33437895c07689d2aeeccfef89b00865ff84fce5387f9982893abc9630977b"} Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.274581 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.392045 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-bundle\") pod \"a7303a1a-93cc-40f2-b087-f5b6723797f8\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.392221 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fqw\" (UniqueName: \"kubernetes.io/projected/a7303a1a-93cc-40f2-b087-f5b6723797f8-kube-api-access-s5fqw\") pod \"a7303a1a-93cc-40f2-b087-f5b6723797f8\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.392264 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-util\") pod \"a7303a1a-93cc-40f2-b087-f5b6723797f8\" (UID: \"a7303a1a-93cc-40f2-b087-f5b6723797f8\") " Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.392923 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-bundle" (OuterVolumeSpecName: "bundle") pod "a7303a1a-93cc-40f2-b087-f5b6723797f8" (UID: "a7303a1a-93cc-40f2-b087-f5b6723797f8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.398908 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7303a1a-93cc-40f2-b087-f5b6723797f8-kube-api-access-s5fqw" (OuterVolumeSpecName: "kube-api-access-s5fqw") pod "a7303a1a-93cc-40f2-b087-f5b6723797f8" (UID: "a7303a1a-93cc-40f2-b087-f5b6723797f8"). InnerVolumeSpecName "kube-api-access-s5fqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.408490 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-util" (OuterVolumeSpecName: "util") pod "a7303a1a-93cc-40f2-b087-f5b6723797f8" (UID: "a7303a1a-93cc-40f2-b087-f5b6723797f8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.493812 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fqw\" (UniqueName: \"kubernetes.io/projected/a7303a1a-93cc-40f2-b087-f5b6723797f8-kube-api-access-s5fqw\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.493861 4929 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:56 crc kubenswrapper[4929]: I1002 11:26:56.493871 4929 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7303a1a-93cc-40f2-b087-f5b6723797f8-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:57 crc kubenswrapper[4929]: I1002 11:26:57.033745 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" event={"ID":"a7303a1a-93cc-40f2-b087-f5b6723797f8","Type":"ContainerDied","Data":"bf034a210d846d4c23c0c988f07ce3fd1c98b95b5fee213c7b10700fa5a0e6c4"} Oct 02 11:26:57 crc kubenswrapper[4929]: I1002 11:26:57.033796 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf034a210d846d4c23c0c988f07ce3fd1c98b95b5fee213c7b10700fa5a0e6c4" Oct 02 11:26:57 crc kubenswrapper[4929]: I1002 11:26:57.033821 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.629856 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2"] Oct 02 11:26:59 crc kubenswrapper[4929]: E1002 11:26:59.630447 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerName="pull" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.630464 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerName="pull" Oct 02 11:26:59 crc kubenswrapper[4929]: E1002 11:26:59.630479 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerName="util" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.630488 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerName="util" Oct 02 11:26:59 crc kubenswrapper[4929]: E1002 11:26:59.630503 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerName="extract" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.630513 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerName="extract" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.630671 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7303a1a-93cc-40f2-b087-f5b6723797f8" containerName="extract" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.631481 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.637235 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj88t\" (UniqueName: \"kubernetes.io/projected/79d4fb9c-c8e4-427a-b7f8-3889ed40703a-kube-api-access-kj88t\") pod \"openstack-operator-controller-operator-5f5c85554c-9k8x2\" (UID: \"79d4fb9c-c8e4-427a-b7f8-3889ed40703a\") " pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.637477 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-g29lf" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.665191 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2"] Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.738233 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj88t\" (UniqueName: \"kubernetes.io/projected/79d4fb9c-c8e4-427a-b7f8-3889ed40703a-kube-api-access-kj88t\") pod \"openstack-operator-controller-operator-5f5c85554c-9k8x2\" (UID: \"79d4fb9c-c8e4-427a-b7f8-3889ed40703a\") " pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.758241 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj88t\" (UniqueName: \"kubernetes.io/projected/79d4fb9c-c8e4-427a-b7f8-3889ed40703a-kube-api-access-kj88t\") pod \"openstack-operator-controller-operator-5f5c85554c-9k8x2\" (UID: \"79d4fb9c-c8e4-427a-b7f8-3889ed40703a\") " pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" Oct 02 11:26:59 crc kubenswrapper[4929]: I1002 11:26:59.949308 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" Oct 02 11:27:00 crc kubenswrapper[4929]: I1002 11:27:00.182856 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2"] Oct 02 11:27:01 crc kubenswrapper[4929]: I1002 11:27:01.060019 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" event={"ID":"79d4fb9c-c8e4-427a-b7f8-3889ed40703a","Type":"ContainerStarted","Data":"deb10ec6ab0b21b51939f0f0ef8cc7149cee7d4f2cfe1d5349cb38cded7b69a8"} Oct 02 11:27:07 crc kubenswrapper[4929]: I1002 11:27:07.105409 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" event={"ID":"79d4fb9c-c8e4-427a-b7f8-3889ed40703a","Type":"ContainerStarted","Data":"6568f84b5ebc128e76a022f3bfc4bcb6e1d4d79bec6486f4531c12094e9c8ee0"} Oct 02 11:27:11 crc kubenswrapper[4929]: I1002 11:27:11.129989 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" event={"ID":"79d4fb9c-c8e4-427a-b7f8-3889ed40703a","Type":"ContainerStarted","Data":"bae0f8b85a0ec11e265494d838b6d08d780e4b93b42a26c252b038c89897bd6f"} Oct 02 11:27:11 crc kubenswrapper[4929]: I1002 11:27:11.130314 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" Oct 02 11:27:11 crc kubenswrapper[4929]: I1002 11:27:11.135461 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" Oct 02 11:27:11 crc kubenswrapper[4929]: I1002 11:27:11.165409 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5f5c85554c-9k8x2" podStartSLOduration=1.414349197 podStartE2EDuration="12.165393504s" podCreationTimestamp="2025-10-02 11:26:59 +0000 UTC" firstStartedPulling="2025-10-02 11:27:00.194127549 +0000 UTC m=+1020.744493913" lastFinishedPulling="2025-10-02 11:27:10.945171856 +0000 UTC m=+1031.495538220" observedRunningTime="2025-10-02 11:27:11.164306763 +0000 UTC m=+1031.714673137" watchObservedRunningTime="2025-10-02 11:27:11.165393504 +0000 UTC m=+1031.715759868" Oct 02 11:27:14 crc kubenswrapper[4929]: I1002 11:27:14.736577 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:27:14 crc kubenswrapper[4929]: I1002 11:27:14.736865 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.780369 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.781703 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.783843 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nt9n7" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.788078 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.789247 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.790985 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-g8s6z" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.802173 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.805802 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.814825 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.816164 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.820016 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-tn4mq" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.872836 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.878683 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.879694 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.883515 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-bh6bg" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.899344 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrrmw\" (UniqueName: \"kubernetes.io/projected/7fc88030-43e9-4a64-a887-f8db65808659-kube-api-access-nrrmw\") pod \"barbican-operator-controller-manager-6ff8b75857-w4lnt\" (UID: \"7fc88030-43e9-4a64-a887-f8db65808659\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.899418 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz8sd\" (UniqueName: \"kubernetes.io/projected/bc30a5a5-e775-4c1d-a644-6072c3b0eeea-kube-api-access-pz8sd\") pod \"glance-operator-controller-manager-84958c4d49-9qljm\" (UID: \"bc30a5a5-e775-4c1d-a644-6072c3b0eeea\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.899490 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kzdz\" (UniqueName: \"kubernetes.io/projected/dbbebfcb-a8c5-411f-a280-9d225411602e-kube-api-access-7kzdz\") pod \"designate-operator-controller-manager-84f4f7b77b-jvvpf\" (UID: \"dbbebfcb-a8c5-411f-a280-9d225411602e\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.899570 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2tnf\" (UniqueName: \"kubernetes.io/projected/e640cd16-0a71-4f55-a6bd-154d16e32427-kube-api-access-f2tnf\") pod \"cinder-operator-controller-manager-644bddb6d8-vbcph\" (UID: \"e640cd16-0a71-4f55-a6bd-154d16e32427\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.940004 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.945395 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.952768 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wcphq" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.970509 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.976413 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.977925 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv"] Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.979041 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.991223 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xjmmt" Oct 02 11:27:39 crc kubenswrapper[4929]: I1002 11:27:39.998223 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.000740 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2tnf\" (UniqueName: \"kubernetes.io/projected/e640cd16-0a71-4f55-a6bd-154d16e32427-kube-api-access-f2tnf\") pod \"cinder-operator-controller-manager-644bddb6d8-vbcph\" (UID: \"e640cd16-0a71-4f55-a6bd-154d16e32427\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.000821 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrrmw\" (UniqueName: \"kubernetes.io/projected/7fc88030-43e9-4a64-a887-f8db65808659-kube-api-access-nrrmw\") pod \"barbican-operator-controller-manager-6ff8b75857-w4lnt\" (UID: \"7fc88030-43e9-4a64-a887-f8db65808659\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.000840 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz8sd\" (UniqueName: \"kubernetes.io/projected/bc30a5a5-e775-4c1d-a644-6072c3b0eeea-kube-api-access-pz8sd\") pod \"glance-operator-controller-manager-84958c4d49-9qljm\" (UID: \"bc30a5a5-e775-4c1d-a644-6072c3b0eeea\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.000875 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kzdz\" (UniqueName: \"kubernetes.io/projected/dbbebfcb-a8c5-411f-a280-9d225411602e-kube-api-access-7kzdz\") pod \"designate-operator-controller-manager-84f4f7b77b-jvvpf\" (UID: \"dbbebfcb-a8c5-411f-a280-9d225411602e\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.007995 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.009152 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.010882 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ht6sl" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.011087 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.014886 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.016217 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.018238 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xtn6j" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.041176 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz8sd\" (UniqueName: \"kubernetes.io/projected/bc30a5a5-e775-4c1d-a644-6072c3b0eeea-kube-api-access-pz8sd\") pod \"glance-operator-controller-manager-84958c4d49-9qljm\" (UID: \"bc30a5a5-e775-4c1d-a644-6072c3b0eeea\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.041258 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.045291 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kzdz\" (UniqueName: \"kubernetes.io/projected/dbbebfcb-a8c5-411f-a280-9d225411602e-kube-api-access-7kzdz\") pod \"designate-operator-controller-manager-84f4f7b77b-jvvpf\" (UID: \"dbbebfcb-a8c5-411f-a280-9d225411602e\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.046251 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2tnf\" (UniqueName: \"kubernetes.io/projected/e640cd16-0a71-4f55-a6bd-154d16e32427-kube-api-access-f2tnf\") pod \"cinder-operator-controller-manager-644bddb6d8-vbcph\" (UID: \"e640cd16-0a71-4f55-a6bd-154d16e32427\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.058329 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrrmw\" (UniqueName: \"kubernetes.io/projected/7fc88030-43e9-4a64-a887-f8db65808659-kube-api-access-nrrmw\") pod \"barbican-operator-controller-manager-6ff8b75857-w4lnt\" (UID: \"7fc88030-43e9-4a64-a887-f8db65808659\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.072131 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.075785 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jb2s5" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.079701 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.088112 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.102821 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnztg\" (UniqueName: \"kubernetes.io/projected/d7bca063-4f2b-4430-bdeb-c018ca23445b-kube-api-access-gnztg\") pod \"keystone-operator-controller-manager-5bd55b4bff-6rxh4\" (UID: \"d7bca063-4f2b-4430-bdeb-c018ca23445b\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.103354 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m62l\" (UniqueName: \"kubernetes.io/projected/6c83fc23-d663-403f-a7d2-2f2398b6d0a3-kube-api-access-4m62l\") pod \"infra-operator-controller-manager-9d6c5db85-s5528\" (UID: \"6c83fc23-d663-403f-a7d2-2f2398b6d0a3\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.103444 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv8bl\" (UniqueName: \"kubernetes.io/projected/315d86d0-376d-45a1-8bb1-bdb533e0a3fd-kube-api-access-jv8bl\") pod \"ironic-operator-controller-manager-5cd4858477-7q5cl\" (UID: \"315d86d0-376d-45a1-8bb1-bdb533e0a3fd\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.103550 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6sqw\" (UniqueName: \"kubernetes.io/projected/31731ea3-445c-4949-826f-012b32eb8737-kube-api-access-x6sqw\") pod \"heat-operator-controller-manager-5d889d78cf-lpvb9\" (UID: \"31731ea3-445c-4949-826f-012b32eb8737\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.103642 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgwv\" (UniqueName: \"kubernetes.io/projected/8bc7a024-b6bf-46df-a6db-a27e3de0316b-kube-api-access-8dgwv\") pod \"horizon-operator-controller-manager-9f4696d94-kg8nv\" (UID: \"8bc7a024-b6bf-46df-a6db-a27e3de0316b\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.103749 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c83fc23-d663-403f-a7d2-2f2398b6d0a3-cert\") pod \"infra-operator-controller-manager-9d6c5db85-s5528\" (UID: \"6c83fc23-d663-403f-a7d2-2f2398b6d0a3\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:27:40 crc 
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.106847 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.114378 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.115595 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.116457 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.117254 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9ztmt"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.129019 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-9kptb"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.130408 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.131514 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.132737 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.136620 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hcntm"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.144998 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.150299 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-phf86"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.151701 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.153939 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.154161 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.155927 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-9kptb"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.157858 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bshng"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.178574 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.191728 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn"]
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.193525 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.205763 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m62l\" (UniqueName: \"kubernetes.io/projected/6c83fc23-d663-403f-a7d2-2f2398b6d0a3-kube-api-access-4m62l\") pod \"infra-operator-controller-manager-9d6c5db85-s5528\" (UID: \"6c83fc23-d663-403f-a7d2-2f2398b6d0a3\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.205803 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv8bl\" (UniqueName: \"kubernetes.io/projected/315d86d0-376d-45a1-8bb1-bdb533e0a3fd-kube-api-access-jv8bl\") pod \"ironic-operator-controller-manager-5cd4858477-7q5cl\" (UID: \"315d86d0-376d-45a1-8bb1-bdb533e0a3fd\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.205839 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6sqw\" (UniqueName: \"kubernetes.io/projected/31731ea3-445c-4949-826f-012b32eb8737-kube-api-access-x6sqw\") pod \"heat-operator-controller-manager-5d889d78cf-lpvb9\" (UID: \"31731ea3-445c-4949-826f-012b32eb8737\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.205864 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgwv\" (UniqueName: \"kubernetes.io/projected/8bc7a024-b6bf-46df-a6db-a27e3de0316b-kube-api-access-8dgwv\") pod \"horizon-operator-controller-manager-9f4696d94-kg8nv\" (UID: \"8bc7a024-b6bf-46df-a6db-a27e3de0316b\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.205914 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c83fc23-d663-403f-a7d2-2f2398b6d0a3-cert\") pod \"infra-operator-controller-manager-9d6c5db85-s5528\" (UID: \"6c83fc23-d663-403f-a7d2-2f2398b6d0a3\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528"
Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.205938 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnztg\" (UniqueName: 
\"kubernetes.io/projected/d7bca063-4f2b-4430-bdeb-c018ca23445b-kube-api-access-gnztg\") pod \"keystone-operator-controller-manager-5bd55b4bff-6rxh4\" (UID: \"d7bca063-4f2b-4430-bdeb-c018ca23445b\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.206066 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.206849 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-dsfp5" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.213518 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c83fc23-d663-403f-a7d2-2f2398b6d0a3-cert\") pod \"infra-operator-controller-manager-9d6c5db85-s5528\" (UID: \"6c83fc23-d663-403f-a7d2-2f2398b6d0a3\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.226830 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgwv\" (UniqueName: \"kubernetes.io/projected/8bc7a024-b6bf-46df-a6db-a27e3de0316b-kube-api-access-8dgwv\") pod \"horizon-operator-controller-manager-9f4696d94-kg8nv\" (UID: \"8bc7a024-b6bf-46df-a6db-a27e3de0316b\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.228507 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.230350 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnztg\" (UniqueName: \"kubernetes.io/projected/d7bca063-4f2b-4430-bdeb-c018ca23445b-kube-api-access-gnztg\") pod \"keystone-operator-controller-manager-5bd55b4bff-6rxh4\" (UID: \"d7bca063-4f2b-4430-bdeb-c018ca23445b\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.230486 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m62l\" (UniqueName: \"kubernetes.io/projected/6c83fc23-d663-403f-a7d2-2f2398b6d0a3-kube-api-access-4m62l\") pod \"infra-operator-controller-manager-9d6c5db85-s5528\" (UID: \"6c83fc23-d663-403f-a7d2-2f2398b6d0a3\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.230949 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv8bl\" (UniqueName: \"kubernetes.io/projected/315d86d0-376d-45a1-8bb1-bdb533e0a3fd-kube-api-access-jv8bl\") pod \"ironic-operator-controller-manager-5cd4858477-7q5cl\" (UID: \"315d86d0-376d-45a1-8bb1-bdb533e0a3fd\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.234622 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.235631 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.238330 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.239340 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6sqw\" (UniqueName: \"kubernetes.io/projected/31731ea3-445c-4949-826f-012b32eb8737-kube-api-access-x6sqw\") pod \"heat-operator-controller-manager-5d889d78cf-lpvb9\" (UID: \"31731ea3-445c-4949-826f-012b32eb8737\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.241345 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.242268 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mfqqg" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.249466 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.276519 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.277336 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-g2zp8" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.291433 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.296369 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.310018 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b2w9\" (UniqueName: \"kubernetes.io/projected/66bdd8cf-6f68-48c3-af01-1785234e988f-kube-api-access-4b2w9\") pod \"neutron-operator-controller-manager-849d5b9b84-52g6g\" (UID: \"66bdd8cf-6f68-48c3-af01-1785234e988f\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.310117 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l28mk\" (UniqueName: \"kubernetes.io/projected/be9c3cb6-dc25-447a-923f-9cfabd7b97f3-kube-api-access-l28mk\") pod \"mariadb-operator-controller-manager-88c7-9kptb\" (UID: \"be9c3cb6-dc25-447a-923f-9cfabd7b97f3\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.310174 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggpdd\" (UniqueName: \"kubernetes.io/projected/0f2221a1-54f4-4c60-89b7-1d7759038d5c-kube-api-access-ggpdd\") pod \"manila-operator-controller-manager-6d68dbc695-qktzg\" (UID: \"0f2221a1-54f4-4c60-89b7-1d7759038d5c\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.310990 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p859\" (UniqueName: \"kubernetes.io/projected/310bec17-d196-4dd4-926c-817b053f36cc-kube-api-access-2p859\") pod \"nova-operator-controller-manager-64cd67b5cb-jwjmh\" (UID: \"310bec17-d196-4dd4-926c-817b053f36cc\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.311098 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6jq6\" (UniqueName: \"kubernetes.io/projected/c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0-kube-api-access-w6jq6\") pod \"octavia-operator-controller-manager-7b787867f4-67mmn\" (UID: \"c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.311547 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.327097 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.327723 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.329944 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wskbm" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.335040 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.340923 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.342269 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.347859 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-b765d" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.386015 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.404636 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.412645 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6jq6\" (UniqueName: \"kubernetes.io/projected/c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0-kube-api-access-w6jq6\") pod \"octavia-operator-controller-manager-7b787867f4-67mmn\" (UID: \"c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.412683 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b2w9\" (UniqueName: \"kubernetes.io/projected/66bdd8cf-6f68-48c3-af01-1785234e988f-kube-api-access-4b2w9\") pod \"neutron-operator-controller-manager-849d5b9b84-52g6g\" (UID: \"66bdd8cf-6f68-48c3-af01-1785234e988f\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.412713 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcnkf\" (UniqueName: \"kubernetes.io/projected/c87923ae-b391-4bc6-8463-16d2f1c4427b-kube-api-access-xcnkf\") pod \"ovn-operator-controller-manager-9976ff44c-hk8xv\" (UID: \"c87923ae-b391-4bc6-8463-16d2f1c4427b\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.412738 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmh6\" (UniqueName: \"kubernetes.io/projected/b1438f7d-c50e-42ed-b444-8a9d019a886b-kube-api-access-mvmh6\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ww7r7\" (UID: \"b1438f7d-c50e-42ed-b444-8a9d019a886b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.414751 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28mk\" (UniqueName: \"kubernetes.io/projected/be9c3cb6-dc25-447a-923f-9cfabd7b97f3-kube-api-access-l28mk\") pod \"mariadb-operator-controller-manager-88c7-9kptb\" (UID: \"be9c3cb6-dc25-447a-923f-9cfabd7b97f3\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 
11:27:40.414780 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv2bj\" (UniqueName: \"kubernetes.io/projected/f2a55e76-361d-4a08-88e9-4b7594fda990-kube-api-access-pv2bj\") pod \"placement-operator-controller-manager-589c58c6c-gc9bk\" (UID: \"f2a55e76-361d-4a08-88e9-4b7594fda990\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.414799 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggpdd\" (UniqueName: \"kubernetes.io/projected/0f2221a1-54f4-4c60-89b7-1d7759038d5c-kube-api-access-ggpdd\") pod \"manila-operator-controller-manager-6d68dbc695-qktzg\" (UID: \"0f2221a1-54f4-4c60-89b7-1d7759038d5c\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.414848 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1438f7d-c50e-42ed-b444-8a9d019a886b-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ww7r7\" (UID: \"b1438f7d-c50e-42ed-b444-8a9d019a886b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.414868 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p859\" (UniqueName: \"kubernetes.io/projected/310bec17-d196-4dd4-926c-817b053f36cc-kube-api-access-2p859\") pod \"nova-operator-controller-manager-64cd67b5cb-jwjmh\" (UID: \"310bec17-d196-4dd4-926c-817b053f36cc\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.427702 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.442157 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6jq6\" (UniqueName: \"kubernetes.io/projected/c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0-kube-api-access-w6jq6\") pod \"octavia-operator-controller-manager-7b787867f4-67mmn\" (UID: \"c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.452929 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggpdd\" (UniqueName: \"kubernetes.io/projected/0f2221a1-54f4-4c60-89b7-1d7759038d5c-kube-api-access-ggpdd\") pod \"manila-operator-controller-manager-6d68dbc695-qktzg\" (UID: \"0f2221a1-54f4-4c60-89b7-1d7759038d5c\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.455180 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.467867 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.488002 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l28mk\" (UniqueName: 
\"kubernetes.io/projected/be9c3cb6-dc25-447a-923f-9cfabd7b97f3-kube-api-access-l28mk\") pod \"mariadb-operator-controller-manager-88c7-9kptb\" (UID: \"be9c3cb6-dc25-447a-923f-9cfabd7b97f3\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.488518 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b2w9\" (UniqueName: \"kubernetes.io/projected/66bdd8cf-6f68-48c3-af01-1785234e988f-kube-api-access-4b2w9\") pod \"neutron-operator-controller-manager-849d5b9b84-52g6g\" (UID: \"66bdd8cf-6f68-48c3-af01-1785234e988f\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.490912 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p859\" (UniqueName: \"kubernetes.io/projected/310bec17-d196-4dd4-926c-817b053f36cc-kube-api-access-2p859\") pod \"nova-operator-controller-manager-64cd67b5cb-jwjmh\" (UID: \"310bec17-d196-4dd4-926c-817b053f36cc\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.515301 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.516145 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skcn\" (UniqueName: \"kubernetes.io/projected/5c208ca0-21fb-4313-b865-7d3219f4f180-kube-api-access-2skcn\") pod \"swift-operator-controller-manager-84d6b4b759-j2xw4\" (UID: \"5c208ca0-21fb-4313-b865-7d3219f4f180\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.516177 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcnkf\" (UniqueName: \"kubernetes.io/projected/c87923ae-b391-4bc6-8463-16d2f1c4427b-kube-api-access-xcnkf\") pod \"ovn-operator-controller-manager-9976ff44c-hk8xv\" (UID: \"c87923ae-b391-4bc6-8463-16d2f1c4427b\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.516203 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmh6\" (UniqueName: \"kubernetes.io/projected/b1438f7d-c50e-42ed-b444-8a9d019a886b-kube-api-access-mvmh6\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ww7r7\" (UID: \"b1438f7d-c50e-42ed-b444-8a9d019a886b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.516230 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv2bj\" (UniqueName: \"kubernetes.io/projected/f2a55e76-361d-4a08-88e9-4b7594fda990-kube-api-access-pv2bj\") pod \"placement-operator-controller-manager-589c58c6c-gc9bk\" (UID: \"f2a55e76-361d-4a08-88e9-4b7594fda990\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.516267 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1438f7d-c50e-42ed-b444-8a9d019a886b-cert\") pod 
\"openstack-baremetal-operator-controller-manager-5869cb545-ww7r7\" (UID: \"b1438f7d-c50e-42ed-b444-8a9d019a886b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:40 crc kubenswrapper[4929]: E1002 11:27:40.516380 4929 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:27:40 crc kubenswrapper[4929]: E1002 11:27:40.516421 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1438f7d-c50e-42ed-b444-8a9d019a886b-cert podName:b1438f7d-c50e-42ed-b444-8a9d019a886b nodeName:}" failed. No retries permitted until 2025-10-02 11:27:41.01640537 +0000 UTC m=+1061.566771734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1438f7d-c50e-42ed-b444-8a9d019a886b-cert") pod "openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" (UID: "b1438f7d-c50e-42ed-b444-8a9d019a886b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.528145 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.529442 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.537468 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.538862 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2czt7" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.545685 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcnkf\" (UniqueName: \"kubernetes.io/projected/c87923ae-b391-4bc6-8463-16d2f1c4427b-kube-api-access-xcnkf\") pod \"ovn-operator-controller-manager-9976ff44c-hk8xv\" (UID: \"c87923ae-b391-4bc6-8463-16d2f1c4427b\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.545755 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.550740 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmh6\" (UniqueName: \"kubernetes.io/projected/b1438f7d-c50e-42ed-b444-8a9d019a886b-kube-api-access-mvmh6\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ww7r7\" (UID: \"b1438f7d-c50e-42ed-b444-8a9d019a886b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.567713 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.601126 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-j5mr4"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.605759 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.608091 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.609418 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-j5mr4"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.615261 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-tbv6v" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.619028 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-426sz\" (UniqueName: \"kubernetes.io/projected/cdadb186-5a8a-4a7c-9dba-9ed850642887-kube-api-access-426sz\") pod \"telemetry-operator-controller-manager-b8d54b5d7-whrgt\" (UID: \"cdadb186-5a8a-4a7c-9dba-9ed850642887\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.619116 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2skcn\" (UniqueName: \"kubernetes.io/projected/5c208ca0-21fb-4313-b865-7d3219f4f180-kube-api-access-2skcn\") pod \"swift-operator-controller-manager-84d6b4b759-j2xw4\" (UID: \"5c208ca0-21fb-4313-b865-7d3219f4f180\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.632869 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv2bj\" (UniqueName: \"kubernetes.io/projected/f2a55e76-361d-4a08-88e9-4b7594fda990-kube-api-access-pv2bj\") pod \"placement-operator-controller-manager-589c58c6c-gc9bk\" (UID: \"f2a55e76-361d-4a08-88e9-4b7594fda990\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.633358 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.653244 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.656831 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skcn\" (UniqueName: \"kubernetes.io/projected/5c208ca0-21fb-4313-b865-7d3219f4f180-kube-api-access-2skcn\") pod \"swift-operator-controller-manager-84d6b4b759-j2xw4\" (UID: \"5c208ca0-21fb-4313-b865-7d3219f4f180\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.672380 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.673896 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.680852 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-bfw7z" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.688453 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.717619 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.738106 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-426sz\" (UniqueName: \"kubernetes.io/projected/cdadb186-5a8a-4a7c-9dba-9ed850642887-kube-api-access-426sz\") pod \"telemetry-operator-controller-manager-b8d54b5d7-whrgt\" (UID: \"cdadb186-5a8a-4a7c-9dba-9ed850642887\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.738471 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrkpf\" (UniqueName: \"kubernetes.io/projected/06267520-c8b3-4436-8088-2c42fdd5c1d3-kube-api-access-vrkpf\") pod \"test-operator-controller-manager-85777745bb-j5mr4\" (UID: \"06267520-c8b3-4436-8088-2c42fdd5c1d3\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.749606 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.750701 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.755875 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.756613 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hkn64" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.777864 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.785472 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-426sz\" (UniqueName: \"kubernetes.io/projected/cdadb186-5a8a-4a7c-9dba-9ed850642887-kube-api-access-426sz\") pod \"telemetry-operator-controller-manager-b8d54b5d7-whrgt\" (UID: \"cdadb186-5a8a-4a7c-9dba-9ed850642887\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.796758 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.799482 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.818983 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.819945 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.828829 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-q5wzr" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.842235 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jzn\" (UniqueName: \"kubernetes.io/projected/b2d4b0ec-0488-4157-b187-d63fbc4a932b-kube-api-access-v8jzn\") pod \"watcher-operator-controller-manager-6b9957f54f-9csrz\" (UID: \"b2d4b0ec-0488-4157-b187-d63fbc4a932b\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.842355 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/177c2f46-2f18-440c-b119-ee413ae5c4ed-cert\") pod \"openstack-operator-controller-manager-7c445c66cc-x2c6p\" (UID: \"177c2f46-2f18-440c-b119-ee413ae5c4ed\") " pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.842488 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tt6s\" (UniqueName: \"kubernetes.io/projected/177c2f46-2f18-440c-b119-ee413ae5c4ed-kube-api-access-8tt6s\") pod \"openstack-operator-controller-manager-7c445c66cc-x2c6p\" (UID: \"177c2f46-2f18-440c-b119-ee413ae5c4ed\") " pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.842546 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrkpf\" (UniqueName: \"kubernetes.io/projected/06267520-c8b3-4436-8088-2c42fdd5c1d3-kube-api-access-vrkpf\") pod \"test-operator-controller-manager-85777745bb-j5mr4\" (UID: \"06267520-c8b3-4436-8088-2c42fdd5c1d3\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.849483 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl"] Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.876853 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrkpf\" (UniqueName: \"kubernetes.io/projected/06267520-c8b3-4436-8088-2c42fdd5c1d3-kube-api-access-vrkpf\") pod \"test-operator-controller-manager-85777745bb-j5mr4\" (UID: \"06267520-c8b3-4436-8088-2c42fdd5c1d3\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.900453 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" Oct 02 11:27:40 crc kubenswrapper[4929]: W1002 11:27:40.935506 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode640cd16_0a71_4f55_a6bd_154d16e32427.slice/crio-3c57c1d7ca4c8f5794e4419caca687c65131ee3360b398710a2cfb5639625b2f WatchSource:0}: Error finding container 3c57c1d7ca4c8f5794e4419caca687c65131ee3360b398710a2cfb5639625b2f: Status 404 returned error can't find the container with id 3c57c1d7ca4c8f5794e4419caca687c65131ee3360b398710a2cfb5639625b2f Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.958076 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jzn\" (UniqueName: \"kubernetes.io/projected/b2d4b0ec-0488-4157-b187-d63fbc4a932b-kube-api-access-v8jzn\") pod \"watcher-operator-controller-manager-6b9957f54f-9csrz\" (UID: \"b2d4b0ec-0488-4157-b187-d63fbc4a932b\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.976107 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/177c2f46-2f18-440c-b119-ee413ae5c4ed-cert\") pod \"openstack-operator-controller-manager-7c445c66cc-x2c6p\" (UID: \"177c2f46-2f18-440c-b119-ee413ae5c4ed\") " pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.976419 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tt6s\" (UniqueName: \"kubernetes.io/projected/177c2f46-2f18-440c-b119-ee413ae5c4ed-kube-api-access-8tt6s\") pod \"openstack-operator-controller-manager-7c445c66cc-x2c6p\" (UID: \"177c2f46-2f18-440c-b119-ee413ae5c4ed\") " pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.976549 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnfh2\" (UniqueName: \"kubernetes.io/projected/a8a1c59c-9589-4343-b7df-1052d533177e-kube-api-access-cnfh2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-84jbl\" (UID: \"a8a1c59c-9589-4343-b7df-1052d533177e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.973603 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph"] Oct 02 11:27:40 crc kubenswrapper[4929]: E1002 11:27:40.977548 4929 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 11:27:40 crc kubenswrapper[4929]: E1002 11:27:40.977609 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/177c2f46-2f18-440c-b119-ee413ae5c4ed-cert podName:177c2f46-2f18-440c-b119-ee413ae5c4ed nodeName:}" failed. No retries permitted until 2025-10-02 11:27:41.477590027 +0000 UTC m=+1062.027956391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/177c2f46-2f18-440c-b119-ee413ae5c4ed-cert") pod "openstack-operator-controller-manager-7c445c66cc-x2c6p" (UID: "177c2f46-2f18-440c-b119-ee413ae5c4ed") : secret "webhook-server-cert" not found Oct 02 11:27:40 crc kubenswrapper[4929]: I1002 11:27:40.980058 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.005597 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tt6s\" (UniqueName: \"kubernetes.io/projected/177c2f46-2f18-440c-b119-ee413ae5c4ed-kube-api-access-8tt6s\") pod \"openstack-operator-controller-manager-7c445c66cc-x2c6p\" (UID: \"177c2f46-2f18-440c-b119-ee413ae5c4ed\") " pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.012054 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jzn\" (UniqueName: \"kubernetes.io/projected/b2d4b0ec-0488-4157-b187-d63fbc4a932b-kube-api-access-v8jzn\") pod \"watcher-operator-controller-manager-6b9957f54f-9csrz\" (UID: \"b2d4b0ec-0488-4157-b187-d63fbc4a932b\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.032317 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.059079 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.079338 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1438f7d-c50e-42ed-b444-8a9d019a886b-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ww7r7\" (UID: \"b1438f7d-c50e-42ed-b444-8a9d019a886b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.079397 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnfh2\" (UniqueName: \"kubernetes.io/projected/a8a1c59c-9589-4343-b7df-1052d533177e-kube-api-access-cnfh2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-84jbl\" (UID: \"a8a1c59c-9589-4343-b7df-1052d533177e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.121119 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1438f7d-c50e-42ed-b444-8a9d019a886b-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-ww7r7\" (UID: \"b1438f7d-c50e-42ed-b444-8a9d019a886b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.121528 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnfh2\" (UniqueName: \"kubernetes.io/projected/a8a1c59c-9589-4343-b7df-1052d533177e-kube-api-access-cnfh2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-84jbl\" (UID: \"a8a1c59c-9589-4343-b7df-1052d533177e\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.139592 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.176904 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.261480 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.341352 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" event={"ID":"e640cd16-0a71-4f55-a6bd-154d16e32427","Type":"ContainerStarted","Data":"3c57c1d7ca4c8f5794e4419caca687c65131ee3360b398710a2cfb5639625b2f"} Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.354942 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" event={"ID":"dbbebfcb-a8c5-411f-a280-9d225411602e","Type":"ContainerStarted","Data":"73f70b2232780506ed30ca03bbf74563428aac51c001d43115e897e96a06fd62"} Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.379616 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" event={"ID":"bc30a5a5-e775-4c1d-a644-6072c3b0eeea","Type":"ContainerStarted","Data":"efabf292487fe68afd3c2b03c823a6c2f82b9060fd9cc088a899df800cf13025"} Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.385794 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" event={"ID":"7fc88030-43e9-4a64-a887-f8db65808659","Type":"ContainerStarted","Data":"7aa51983de24355860b06e8a293fd40de9106a279f061071e10c8905c800a6a7"} Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.485134 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/177c2f46-2f18-440c-b119-ee413ae5c4ed-cert\") pod \"openstack-operator-controller-manager-7c445c66cc-x2c6p\" (UID: \"177c2f46-2f18-440c-b119-ee413ae5c4ed\") " pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.490720 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/177c2f46-2f18-440c-b119-ee413ae5c4ed-cert\") pod \"openstack-operator-controller-manager-7c445c66cc-x2c6p\" (UID: \"177c2f46-2f18-440c-b119-ee413ae5c4ed\") " pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.599038 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.609536 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.613998 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv"] 
Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.620886 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.631098 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.637986 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.652399 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.661814 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-9kptb"] Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.665809 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe9c3cb6_dc25_447a_923f_9cfabd7b97f3.slice/crio-c0495cf32a54cf5de8dce82ac5ae8999e0e6eb7cb773ef273f4d6ab5806279f2 WatchSource:0}: Error finding container c0495cf32a54cf5de8dce82ac5ae8999e0e6eb7cb773ef273f4d6ab5806279f2: Status 404 returned error can't find the container with id c0495cf32a54cf5de8dce82ac5ae8999e0e6eb7cb773ef273f4d6ab5806279f2 Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.691530 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.724667 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.730080 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.736848 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn"] Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.738948 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87bc3f1_fd85_4e4f_bca1_ac892f48f6f0.slice/crio-c717116227509f4259baca0ca40426079ec2c3050c325a8ce8e8af23c679b048 WatchSource:0}: Error finding container c717116227509f4259baca0ca40426079ec2c3050c325a8ce8e8af23c679b048: Status 404 returned error can't find the container with id c717116227509f4259baca0ca40426079ec2c3050c325a8ce8e8af23c679b048 Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.739854 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66bdd8cf_6f68_48c3_af01_1785234e988f.slice/crio-09268983e415fafd4ac3f57577a08a70ab3b6cc1170417ba14994d54160aeb10 WatchSource:0}: Error finding container 09268983e415fafd4ac3f57577a08a70ab3b6cc1170417ba14994d54160aeb10: Status 404 returned error can't find the container with id 09268983e415fafd4ac3f57577a08a70ab3b6cc1170417ba14994d54160aeb10 Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.742577 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv"] Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.742596 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87923ae_b391_4bc6_8463_16d2f1c4427b.slice/crio-a5f70824290bb721ced2bde9c99a62b5d19abf13a358c12a0954d8c5dd1b3a40 WatchSource:0}: Error finding container a5f70824290bb721ced2bde9c99a62b5d19abf13a358c12a0954d8c5dd1b3a40: Status 404 returned error can't find the container with id a5f70824290bb721ced2bde9c99a62b5d19abf13a358c12a0954d8c5dd1b3a40 Oct 02 11:27:41 crc kubenswrapper[4929]: E1002 11:27:41.743248 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4b2w9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-849d5b9b84-52g6g_openstack-operators(66bdd8cf-6f68-48c3-af01-1785234e988f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:27:41 crc kubenswrapper[4929]: E1002 11:27:41.744979 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xcnkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-hk8xv_openstack-operators(c87923ae-b391-4bc6-8463-16d2f1c4427b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.748928 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk"] Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.751430 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a55e76_361d_4a08_88e9_4b7594fda990.slice/crio-2f7baf36549471a3f08d991ce4e3a08e76b24a38765d4d1be08c515f45a806f2 WatchSource:0}: Error finding container 2f7baf36549471a3f08d991ce4e3a08e76b24a38765d4d1be08c515f45a806f2: Status 404 returned error can't find the container with id 2f7baf36549471a3f08d991ce4e3a08e76b24a38765d4d1be08c515f45a806f2 Oct 02 11:27:41 crc kubenswrapper[4929]: E1002 11:27:41.761717 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pv2bj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-gc9bk_openstack-operators(f2a55e76-361d-4a08-88e9-4b7594fda990): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.853300 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.876214 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.888124 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7"] Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.901853 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-j5mr4"] Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.911070 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d4b0ec_0488_4157_b187_d63fbc4a932b.slice/crio-bd4c2c9f14cd3cb85ed6045bf7c54819e44101a7491fcd5424c4b0fc93597bd8 WatchSource:0}: Error finding container bd4c2c9f14cd3cb85ed6045bf7c54819e44101a7491fcd5424c4b0fc93597bd8: Status 404 returned error can't find the container with id bd4c2c9f14cd3cb85ed6045bf7c54819e44101a7491fcd5424c4b0fc93597bd8 Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.911468 4929 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1438f7d_c50e_42ed_b444_8a9d019a886b.slice/crio-dcc55c252a7eb01223dca3fc32383768b851575749dcaa4b2b33169ebb045e3b WatchSource:0}: Error finding container dcc55c252a7eb01223dca3fc32383768b851575749dcaa4b2b33169ebb045e3b: Status 404 returned error can't find the container with id dcc55c252a7eb01223dca3fc32383768b851575749dcaa4b2b33169ebb045e3b Oct 02 11:27:41 crc kubenswrapper[4929]: W1002 11:27:41.914570 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06267520_c8b3_4436_8088_2c42fdd5c1d3.slice/crio-69f1a05b30daed6f9fcad124b0cf5cdf185edd6b4fba2959168dfe46531ab260 WatchSource:0}: Error finding container 69f1a05b30daed6f9fcad124b0cf5cdf185edd6b4fba2959168dfe46531ab260: Status 404 returned error can't find the container with id 69f1a05b30daed6f9fcad124b0cf5cdf185edd6b4fba2959168dfe46531ab260 Oct 02 11:27:41 crc kubenswrapper[4929]: E1002 11:27:41.914952 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io
/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NE
UTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-p
odified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swi
ft-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvmh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5869cb545-ww7r7_openstack-operators(b1438f7d-c50e-42ed-b444-8a9d019a886b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:27:41 crc kubenswrapper[4929]: E1002 11:27:41.915519 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v8jzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b9957f54f-9csrz_openstack-operators(b2d4b0ec-0488-4157-b187-d63fbc4a932b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:27:41 crc kubenswrapper[4929]: I1002 11:27:41.917259 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl"] Oct 02 11:27:41 crc kubenswrapper[4929]: E1002 11:27:41.919079 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vrkpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-j5mr4_openstack-operators(06267520-c8b3-4436-8088-2c42fdd5c1d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.008696 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cnfh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-84jbl_openstack-operators(a8a1c59c-9589-4343-b7df-1052d533177e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
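A note on the probe specs dumped in the container definitions above: the HTTPGet liveness probe on /healthz and readiness probe on /readyz, both against port 8081, match the --health-probe-bind-address=:8081 argument and are the stock health endpoints that controller-runtime-based operators expose. A minimal sketch of how such a manager wires them up, assuming the standard kubebuilder scaffold rather than these operators' actual main.go:

    package main

    import (
        ctrl "sigs.k8s.io/controller-runtime"
        "sigs.k8s.io/controller-runtime/pkg/healthz"
    )

    func main() {
        mgr, err := ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{
            HealthProbeBindAddress: ":8081", // matches --health-probe-bind-address=:8081 in the Args above
        })
        if err != nil {
            panic(err)
        }
        // Served at /healthz; the liveness probe above polls it every 20s after a 15s delay.
        if err := mgr.AddHealthzCheck("healthz", healthz.Ping); err != nil {
            panic(err)
        }
        // Served at /readyz; the readiness probe above polls it every 10s after a 5s delay.
        if err := mgr.AddReadyzCheck("readyz", healthz.Ping); err != nil {
            panic(err)
        }
        if err := mgr.Start(ctrl.SetupSignalHandler()); err != nil {
            panic(err)
        }
    }

Until the image pull succeeds nothing listens on 8081, which is why these probes never fire here and the pods sit in ErrImagePull instead.

Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.010303 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" podUID="a8a1c59c-9589-4343-b7df-1052d533177e"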
Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.032090 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p"]
Oct 02 11:27:42 crc kubenswrapper[4929]: W1002 11:27:42.067677 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod177c2f46_2f18_440c_b119_ee413ae5c4ed.slice/crio-233e7376730277cfe10fa1690142b438e6f70dfa66374b04b593b4b902b9a9f5 WatchSource:0}: Error finding container 233e7376730277cfe10fa1690142b438e6f70dfa66374b04b593b4b902b9a9f5: Status 404 returned error can't find the container with id 233e7376730277cfe10fa1690142b438e6f70dfa66374b04b593b4b902b9a9f5
Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.084296 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" podUID="c87923ae-b391-4bc6-8463-16d2f1c4427b"
Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.092041 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" podUID="f2a55e76-361d-4a08-88e9-4b7594fda990"
Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.095802 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" podUID="66bdd8cf-6f68-48c3-af01-1785234e988f"
Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.325299 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" podUID="b2d4b0ec-0488-4157-b187-d63fbc4a932b"
Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.326446 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" podUID="b1438f7d-c50e-42ed-b444-8a9d019a886b"
Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.341499 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" podUID="06267520-c8b3-4436-8088-2c42fdd5c1d3"
Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.430903 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" event={"ID":"f2a55e76-361d-4a08-88e9-4b7594fda990","Type":"ContainerStarted","Data":"483281cf951f52c376052795ca2607e6e01eaf948d846dd8655616aac53d97fb"}
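Every operator pod on this node hit the same failure at once: "pull QPS exceeded" is kubelet's image-pull throttle, a token bucket governed by the registryPullQPS and registryBurst kubelet settings (5 QPS with a burst of 10 by default; the defaults are assumed here, not read from this node's config). The bucket rejects rather than queues, so a mass pod start like this one drains the burst and every further pull fails immediately. A minimal sketch with the same client-go helper kubelet uses for this throttle:

    package main

    import (
        "fmt"

        "k8s.io/client-go/util/flowcontrol"
    )

    func main() {
        // Token bucket: refills at 5 tokens/s, holds at most 10 (assumed kubelet defaults).
        limiter := flowcontrol.NewTokenBucketRateLimiter(5.0, 10)
        // Simulate ~30 operator pods all asking to pull at once: the burst admits
        // the first few, the rest are rejected outright rather than queued --
        // that rejection is what surfaces as ErrImagePull in the records above.
        for i := 1; i <= 30; i++ {
            if limiter.TryAccept() {
                fmt.Printf("pull %02d: admitted\n", i)
            } else {
                fmt.Printf("pull %02d: pull QPS exceeded\n", i)
            }
        }
    }

The rejected pods are not stuck; each one lands in the per-image backoff visible a few records below.

Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.430952 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk"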
event={"ID":"f2a55e76-361d-4a08-88e9-4b7594fda990","Type":"ContainerStarted","Data":"2f7baf36549471a3f08d991ce4e3a08e76b24a38765d4d1be08c515f45a806f2"} Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.438992 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" podUID="f2a55e76-361d-4a08-88e9-4b7594fda990" Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.439404 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" event={"ID":"310bec17-d196-4dd4-926c-817b053f36cc","Type":"ContainerStarted","Data":"0ccf382935c064267de9617aae39bee5dab33baa131d04d1f40586ee72e5a574"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.444260 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" event={"ID":"a8a1c59c-9589-4343-b7df-1052d533177e","Type":"ContainerStarted","Data":"2f9ab5fe28722b4bf17b6b9314356b521e7f6927d32e75f733fa0c2f3c221705"} Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.446693 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" podUID="a8a1c59c-9589-4343-b7df-1052d533177e" Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.448942 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" event={"ID":"b2d4b0ec-0488-4157-b187-d63fbc4a932b","Type":"ContainerStarted","Data":"2da26d3cf265b53f9213ba3a09f55cb686b2bfa5d22dc041b3b5984af39556d2"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.448990 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" event={"ID":"b2d4b0ec-0488-4157-b187-d63fbc4a932b","Type":"ContainerStarted","Data":"bd4c2c9f14cd3cb85ed6045bf7c54819e44101a7491fcd5424c4b0fc93597bd8"} Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.450896 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" podUID="b2d4b0ec-0488-4157-b187-d63fbc4a932b" Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.458225 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" event={"ID":"b1438f7d-c50e-42ed-b444-8a9d019a886b","Type":"ContainerStarted","Data":"ee4f211e031dd46f3894f923731af3603daf7e106a9a4e320cbf705f02a62667"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.458270 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" 
event={"ID":"b1438f7d-c50e-42ed-b444-8a9d019a886b","Type":"ContainerStarted","Data":"dcc55c252a7eb01223dca3fc32383768b851575749dcaa4b2b33169ebb045e3b"} Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.459524 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" podUID="b1438f7d-c50e-42ed-b444-8a9d019a886b" Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.461097 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" event={"ID":"177c2f46-2f18-440c-b119-ee413ae5c4ed","Type":"ContainerStarted","Data":"f969d66847ff8dfc677770b3296dd434b2cffde8315b0406e83a26cb0d765682"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.461129 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" event={"ID":"177c2f46-2f18-440c-b119-ee413ae5c4ed","Type":"ContainerStarted","Data":"233e7376730277cfe10fa1690142b438e6f70dfa66374b04b593b4b902b9a9f5"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.462096 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" event={"ID":"c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0","Type":"ContainerStarted","Data":"c717116227509f4259baca0ca40426079ec2c3050c325a8ce8e8af23c679b048"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.463953 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" event={"ID":"cdadb186-5a8a-4a7c-9dba-9ed850642887","Type":"ContainerStarted","Data":"a0d06aef1d582214eb7b5bcf6384932f3d26340f1291b9ff074cee0f99dbad9e"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.466901 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" event={"ID":"06267520-c8b3-4436-8088-2c42fdd5c1d3","Type":"ContainerStarted","Data":"9414063bcf15fede86918cd21d7ba4fffa01499b41ddc6ca2752451789ce7b90"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.466939 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" event={"ID":"06267520-c8b3-4436-8088-2c42fdd5c1d3","Type":"ContainerStarted","Data":"69f1a05b30daed6f9fcad124b0cf5cdf185edd6b4fba2959168dfe46531ab260"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.468858 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" event={"ID":"5c208ca0-21fb-4313-b865-7d3219f4f180","Type":"ContainerStarted","Data":"8339567d3b92a9f5ef6d633722aa011037831e4ee168c9350f00717593f17663"} Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.469865 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" 
podUID="06267520-c8b3-4436-8088-2c42fdd5c1d3" Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.475360 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" event={"ID":"be9c3cb6-dc25-447a-923f-9cfabd7b97f3","Type":"ContainerStarted","Data":"c0495cf32a54cf5de8dce82ac5ae8999e0e6eb7cb773ef273f4d6ab5806279f2"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.477776 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" event={"ID":"315d86d0-376d-45a1-8bb1-bdb533e0a3fd","Type":"ContainerStarted","Data":"7a37694f42165779f5afd0ea02ec067d58c63850ef532d8e3611f43d61ca6a19"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.481814 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" event={"ID":"0f2221a1-54f4-4c60-89b7-1d7759038d5c","Type":"ContainerStarted","Data":"249175dc0d4898758ebd006914e80a8aad9004db70e60bbc51661c296b58a1d7"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.484157 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" event={"ID":"8bc7a024-b6bf-46df-a6db-a27e3de0316b","Type":"ContainerStarted","Data":"5b97a7487b702942f4ba1000a1ef5e69e511a631a5a0d33f6f9cc24bce8cbcfa"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.488761 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" event={"ID":"31731ea3-445c-4949-826f-012b32eb8737","Type":"ContainerStarted","Data":"eccc60bcaf686f159894e108409af7889e2bad71e8c04539b5c74be371f4b99a"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.490375 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" event={"ID":"66bdd8cf-6f68-48c3-af01-1785234e988f","Type":"ContainerStarted","Data":"816ecccca36925290d0338a236151b9453e09f63681c34a51bfce48c5e0fe62d"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.490397 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" event={"ID":"66bdd8cf-6f68-48c3-af01-1785234e988f","Type":"ContainerStarted","Data":"09268983e415fafd4ac3f57577a08a70ab3b6cc1170417ba14994d54160aeb10"} Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.495648 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" podUID="66bdd8cf-6f68-48c3-af01-1785234e988f" Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.499475 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" event={"ID":"d7bca063-4f2b-4430-bdeb-c018ca23445b","Type":"ContainerStarted","Data":"01faf005cb5b8f7196cca10f71be445a49d96b182e768621da8e962580baf796"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.510731 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" 
event={"ID":"c87923ae-b391-4bc6-8463-16d2f1c4427b","Type":"ContainerStarted","Data":"47a43254a149847be0def4e9be7668e862ccdbce5e3f2592921b266c46d72e79"} Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.510789 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" event={"ID":"c87923ae-b391-4bc6-8463-16d2f1c4427b","Type":"ContainerStarted","Data":"a5f70824290bb721ced2bde9c99a62b5d19abf13a358c12a0954d8c5dd1b3a40"} Oct 02 11:27:42 crc kubenswrapper[4929]: E1002 11:27:42.512557 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" podUID="c87923ae-b391-4bc6-8463-16d2f1c4427b" Oct 02 11:27:42 crc kubenswrapper[4929]: I1002 11:27:42.517823 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" event={"ID":"6c83fc23-d663-403f-a7d2-2f2398b6d0a3","Type":"ContainerStarted","Data":"ff000a078662c9c47cf879ee5dabc930bd817bba0f1f79721bda65e9bcb71748"} Oct 02 11:27:43 crc kubenswrapper[4929]: I1002 11:27:43.535034 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" event={"ID":"177c2f46-2f18-440c-b119-ee413ae5c4ed","Type":"ContainerStarted","Data":"ac9668398c248964b4248291158df5848306b4be4256dd9302a2a856714729d0"} Oct 02 11:27:43 crc kubenswrapper[4929]: I1002 11:27:43.535581 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" Oct 02 11:27:43 crc kubenswrapper[4929]: E1002 11:27:43.536899 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" podUID="b1438f7d-c50e-42ed-b444-8a9d019a886b" Oct 02 11:27:43 crc kubenswrapper[4929]: E1002 11:27:43.537582 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" podUID="c87923ae-b391-4bc6-8463-16d2f1c4427b" Oct 02 11:27:43 crc kubenswrapper[4929]: E1002 11:27:43.537770 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" podUID="f2a55e76-361d-4a08-88e9-4b7594fda990" Oct 02 11:27:43 crc kubenswrapper[4929]: E1002 11:27:43.537823 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" podUID="b2d4b0ec-0488-4157-b187-d63fbc4a932b"
Oct 02 11:27:43 crc kubenswrapper[4929]: E1002 11:27:43.537933 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" podUID="66bdd8cf-6f68-48c3-af01-1785234e988f"
Oct 02 11:27:43 crc kubenswrapper[4929]: E1002 11:27:43.538044 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" podUID="06267520-c8b3-4436-8088-2c42fdd5c1d3"
Oct 02 11:27:43 crc kubenswrapper[4929]: E1002 11:27:43.543342 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" podUID="a8a1c59c-9589-4343-b7df-1052d533177e"
Oct 02 11:27:43 crc kubenswrapper[4929]: I1002 11:27:43.650377 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p" podStartSLOduration=3.650359111 podStartE2EDuration="3.650359111s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:27:43.646305611 +0000 UTC m=+1064.196671975" watchObservedRunningTime="2025-10-02 11:27:43.650359111 +0000 UTC m=+1064.200725475"
Oct 02 11:27:44 crc kubenswrapper[4929]: I1002 11:27:44.736655 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:27:44 crc kubenswrapper[4929]: I1002 11:27:44.736730 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:27:44 crc kubenswrapper[4929]: I1002 11:27:44.736784 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
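Once a pull fails with ErrImagePull, retries fall under a per-image exponential backoff, and the pod worker reports ImagePullBackOff ("Back-off pulling image ...") until the next attempt is allowed, as in the records above. A sketch of that delay schedule with client-go's Backoff helper, assuming kubelet's usual 10s initial delay and 300s cap:

    package main

    import (
        "fmt"
        "time"

        "k8s.io/client-go/util/flowcontrol"
    )

    func main() {
        backoff := flowcontrol.NewBackOff(10*time.Second, 300*time.Second)
        key := "quay.io/openstack-k8s-operators/watcher-operator" // delay is tracked per image
        now := time.Now()
        for attempt := 1; attempt <= 7; attempt++ {
            backoff.Next(key, now) // each failed pull doubles the delay, capped at 300s
            fmt.Printf("attempt %d: next retry in %v\n", attempt, backoff.Get(key))
        }
        // Prints 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s.
    }

Around 11:28:03 below, the backed-off pods finally pull their images and start, which is consistent with a schedule like this rather than a permanent failure.

Oct 02 11:27:44 crc kubenswrapper[4929]: I1002 11:27:44.737561 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f30c8067764cbf742a0d9d0a1f047810aa84e3e7853a564b95946cb32658616"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"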
Oct 02 11:27:44 crc kubenswrapper[4929]: I1002 11:27:44.737626 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://4f30c8067764cbf742a0d9d0a1f047810aa84e3e7853a564b95946cb32658616" gracePeriod=600
Oct 02 11:27:45 crc kubenswrapper[4929]: I1002 11:27:45.552012 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="4f30c8067764cbf742a0d9d0a1f047810aa84e3e7853a564b95946cb32658616" exitCode=0
Oct 02 11:27:45 crc kubenswrapper[4929]: I1002 11:27:45.552108 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"4f30c8067764cbf742a0d9d0a1f047810aa84e3e7853a564b95946cb32658616"}
Oct 02 11:27:45 crc kubenswrapper[4929]: I1002 11:27:45.552395 4929 scope.go:117] "RemoveContainer" containerID="f87898e72f32d780a00a4311f29a4b41ada294ade544d5a9ece8958a1d5f9fd0"
Oct 02 11:27:51 crc kubenswrapper[4929]: I1002 11:27:51.699374 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7c445c66cc-x2c6p"
Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.645035 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" event={"ID":"5c208ca0-21fb-4313-b865-7d3219f4f180","Type":"ContainerStarted","Data":"f2e1781b6057a4d044952a3b86fe9a20679c7e028fb1f788214da9d3b05eeeda"}
Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.652759 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" event={"ID":"e640cd16-0a71-4f55-a6bd-154d16e32427","Type":"ContainerStarted","Data":"3bcb5fe266b19f24a2f532d52dbf37c9af56e3c047b3abeb7e0f3fd926ae16de"}
Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.654817 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" event={"ID":"c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0","Type":"ContainerStarted","Data":"046882166f9b8e62b3687317c06f345c201bdb5b9869569538ea194e305e4840"}
Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.656454 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" event={"ID":"6c83fc23-d663-403f-a7d2-2f2398b6d0a3","Type":"ContainerStarted","Data":"50a28063574990bc8581798f673a6d3f2f07f49952086a60891f0c83e323658b"}
Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.657508 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" event={"ID":"0f2221a1-54f4-4c60-89b7-1d7759038d5c","Type":"ContainerStarted","Data":"bc36f987d8631550637192bc7cb14feb5e4b3663bc31944939f1046aececc93b"}
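The machine-config-daemon records above are a textbook liveness-driven restart: the HTTP probe to 127.0.0.1:8798/health gets connection refused, kubelet marks the container unhealthy, kills it with the pod's 600s grace period, PLEG reports ContainerDied, and a replacement container appears among the ContainerStarted events below (11:27:55.751741). A sketch of roughly the probe shape that drives this; the endpoint matches the log, but the timing fields are illustrative placeholders, not values taken from it:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        probe := &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Host:   "127.0.0.1", // the daemon runs with host networking
                    Path:   "/health",   // the URL in the "Probe failed" record above
                    Port:   intstr.FromInt32(8798),
                    Scheme: corev1.URISchemeHTTP,
                },
            },
            PeriodSeconds:    10, // placeholder
            FailureThreshold: 3,  // placeholder: consecutive failures before the kill
        }
        fmt.Printf("%+v\n", probe)
    }

Note the exitCode=0 above: the process shut down cleanly on SIGTERM within the grace period, so this is a probe-triggered restart, not a crash.

Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.658552 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf"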
event={"ID":"dbbebfcb-a8c5-411f-a280-9d225411602e","Type":"ContainerStarted","Data":"801620e8a50696edb4936b5c72000deedaa3d4de7f8e784aec57e259e3982114"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.659928 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" event={"ID":"315d86d0-376d-45a1-8bb1-bdb533e0a3fd","Type":"ContainerStarted","Data":"b1132e4683dcec608ba80a144c6c5cb684888e70724963b24dca6b14e4449f05"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.693820 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" event={"ID":"310bec17-d196-4dd4-926c-817b053f36cc","Type":"ContainerStarted","Data":"d1e0c9ffd9ddb9bb4cb5b3bc9855853184005e6207215a74a86227494d63ecdb"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.696730 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" event={"ID":"7fc88030-43e9-4a64-a887-f8db65808659","Type":"ContainerStarted","Data":"a271ad1f4b51eb0c077041f4aef0263bf0911ccd4e0d04da284160db84cd6fb0"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.696776 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" event={"ID":"7fc88030-43e9-4a64-a887-f8db65808659","Type":"ContainerStarted","Data":"3aaad7090213fd5dedb28b9b6320e9ebaa66c41e2b72a168f8ac5613860d41c6"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.727597 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" event={"ID":"be9c3cb6-dc25-447a-923f-9cfabd7b97f3","Type":"ContainerStarted","Data":"644a7fa6affcf15a2b9de0e8f5a67be0f54bcd346c14da15e98dd9b514a97ea7"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.743696 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" event={"ID":"cdadb186-5a8a-4a7c-9dba-9ed850642887","Type":"ContainerStarted","Data":"203c4fe4d4fc5053b5460d51c8cb456aa5284debaf4e239590d662bdc2fc3d0f"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.751741 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"d06bfb52896e631ee026cc068e1500959957fd07486c92bce6fd839653f6a217"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.776281 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" event={"ID":"d7bca063-4f2b-4430-bdeb-c018ca23445b","Type":"ContainerStarted","Data":"95fde60482bafe3839110ac942339afeb4c32e0148eb17386b155a8c7773328d"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.808293 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" event={"ID":"8bc7a024-b6bf-46df-a6db-a27e3de0316b","Type":"ContainerStarted","Data":"ebd2d35104245700bb83f109b335a0d992a7f796a0d0db923acb297ea83152b1"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.854237 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" 
event={"ID":"31731ea3-445c-4949-826f-012b32eb8737","Type":"ContainerStarted","Data":"0fc48dda90c653f3449695b424be7a811f0f3112f0bc8d515a1e73a2cab62c79"} Oct 02 11:27:55 crc kubenswrapper[4929]: I1002 11:27:55.893193 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" event={"ID":"bc30a5a5-e775-4c1d-a644-6072c3b0eeea","Type":"ContainerStarted","Data":"865e51b713f9d0f900e8a55697d2e77cb4dbf56086f6e25b6e92aea41e03257b"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.935150 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" event={"ID":"31731ea3-445c-4949-826f-012b32eb8737","Type":"ContainerStarted","Data":"e9e4d2aae7d0ec45a974ff9e534ad6070b8f1e0e90e7394106ee17b9296ede96"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.935762 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.942945 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" event={"ID":"be9c3cb6-dc25-447a-923f-9cfabd7b97f3","Type":"ContainerStarted","Data":"a7ff011b9e14524fe4cd54a7a602c017e3b8e92fdadd4d41970f5a309476dd21"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.943103 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.948381 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" event={"ID":"315d86d0-376d-45a1-8bb1-bdb533e0a3fd","Type":"ContainerStarted","Data":"3ce164c10dfb624c2be8061dbc708e12628f1f31dc16a3df988491520aea8a60"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.948887 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.952132 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" event={"ID":"c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0","Type":"ContainerStarted","Data":"81890c925dd988dbfb0339468808cffc9ac82e9e90773ca4f08c3f8c3976f1f7"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.952587 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.959448 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" event={"ID":"0f2221a1-54f4-4c60-89b7-1d7759038d5c","Type":"ContainerStarted","Data":"5630ac52d7bbb67c7e35e7bd8afae8b651666ffc6c492a4b9c8576c573d1aaa0"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.959933 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.962973 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" 
event={"ID":"d7bca063-4f2b-4430-bdeb-c018ca23445b","Type":"ContainerStarted","Data":"828555414ac1f7a427ad22badc0f89eedc5b4c85b76338e007b5416d202fa3d3"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.963065 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.966671 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" podStartSLOduration=5.075571769 podStartE2EDuration="17.96665496s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.649146307 +0000 UTC m=+1062.199512671" lastFinishedPulling="2025-10-02 11:27:54.540229478 +0000 UTC m=+1075.090595862" observedRunningTime="2025-10-02 11:27:56.961627083 +0000 UTC m=+1077.511993447" watchObservedRunningTime="2025-10-02 11:27:56.96665496 +0000 UTC m=+1077.517021324" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.970420 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" event={"ID":"5c208ca0-21fb-4313-b865-7d3219f4f180","Type":"ContainerStarted","Data":"ec31767467c256340618b68911d68fac30845c1a79ae7f58be501c2b36e3e737"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.970560 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.973429 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" event={"ID":"6c83fc23-d663-403f-a7d2-2f2398b6d0a3","Type":"ContainerStarted","Data":"7f788ab2504f2b62dc0c5f4b3e419ed4323abbb9bfb1d16816f7d239a98a6c74"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.973854 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.976149 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" event={"ID":"e640cd16-0a71-4f55-a6bd-154d16e32427","Type":"ContainerStarted","Data":"ff0aeb5b8402bb0ced044927071c3138ab5793703ec0a48783761119979f039e"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.976295 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.977772 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" podStartSLOduration=5.087926375 podStartE2EDuration="17.977757552s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.638050795 +0000 UTC m=+1062.188417179" lastFinishedPulling="2025-10-02 11:27:54.527881992 +0000 UTC m=+1075.078248356" observedRunningTime="2025-10-02 11:27:56.974137684 +0000 UTC m=+1077.524504048" watchObservedRunningTime="2025-10-02 11:27:56.977757552 +0000 UTC m=+1077.528123916" Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.984253 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" 
event={"ID":"dbbebfcb-a8c5-411f-a280-9d225411602e","Type":"ContainerStarted","Data":"8f41a301d8ac329e70c08cfabe805fbba9c789507f8330cdf0cd258ae129f762"} Oct 02 11:27:56 crc kubenswrapper[4929]: I1002 11:27:56.984778 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.000091 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" event={"ID":"bc30a5a5-e775-4c1d-a644-6072c3b0eeea","Type":"ContainerStarted","Data":"5203d0782310113456fc717f77edd04378db10abe2dcf692eb44df5353b1eef3"} Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.000729 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.007205 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" podStartSLOduration=5.2217548560000004 podStartE2EDuration="18.007191523s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.742843656 +0000 UTC m=+1062.293210020" lastFinishedPulling="2025-10-02 11:27:54.528280303 +0000 UTC m=+1075.078646687" observedRunningTime="2025-10-02 11:27:57.004692985 +0000 UTC m=+1077.555059349" watchObservedRunningTime="2025-10-02 11:27:57.007191523 +0000 UTC m=+1077.557557887" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.016541 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" event={"ID":"310bec17-d196-4dd4-926c-817b053f36cc","Type":"ContainerStarted","Data":"ed2bf4d3f5d84f61a5d6ae42c5a9c04407671cf70d82b83a46f798dce295af8e"} Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.017263 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.025814 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" event={"ID":"8bc7a024-b6bf-46df-a6db-a27e3de0316b","Type":"ContainerStarted","Data":"4350fc90036336241807833cd18a65752d8b1c55939c00f869d6a71b11e15154"} Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.026460 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.032744 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" event={"ID":"cdadb186-5a8a-4a7c-9dba-9ed850642887","Type":"ContainerStarted","Data":"84f4559e0e31306bf09da093e9e7c25f33a122d88ba9978b2a6bb493072cdb41"} Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.033759 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.033774 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.037894 4929 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" podStartSLOduration=5.143768045 podStartE2EDuration="18.037880198s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.634717285 +0000 UTC m=+1062.185083649" lastFinishedPulling="2025-10-02 11:27:54.528829438 +0000 UTC m=+1075.079195802" observedRunningTime="2025-10-02 11:27:57.030929179 +0000 UTC m=+1077.581295543" watchObservedRunningTime="2025-10-02 11:27:57.037880198 +0000 UTC m=+1077.588246562"
Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.051880 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" podStartSLOduration=5.194668289 podStartE2EDuration="18.051862648s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.669581433 +0000 UTC m=+1062.219947797" lastFinishedPulling="2025-10-02 11:27:54.526775792 +0000 UTC m=+1075.077142156" observedRunningTime="2025-10-02 11:27:57.04934688 +0000 UTC m=+1077.599713244" watchObservedRunningTime="2025-10-02 11:27:57.051862648 +0000 UTC m=+1077.602229012"
Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.071485 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" podStartSLOduration=5.19761181 podStartE2EDuration="18.071465812s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.65401014 +0000 UTC m=+1062.204376504" lastFinishedPulling="2025-10-02 11:27:54.527864142 +0000 UTC m=+1075.078230506" observedRunningTime="2025-10-02 11:27:57.06514616 +0000 UTC m=+1077.615512534" watchObservedRunningTime="2025-10-02 11:27:57.071465812 +0000 UTC m=+1077.621832176"
Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.106540 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" podStartSLOduration=4.536755971 podStartE2EDuration="18.106524536s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:40.956036071 +0000 UTC m=+1061.506402435" lastFinishedPulling="2025-10-02 11:27:54.525804636 +0000 UTC m=+1075.076171000" observedRunningTime="2025-10-02 11:27:57.082162563 +0000 UTC m=+1077.632528927" watchObservedRunningTime="2025-10-02 11:27:57.106524536 +0000 UTC m=+1077.656890900"
Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.111354 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" podStartSLOduration=5.24801014 podStartE2EDuration="18.111337126s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.664068503 +0000 UTC m=+1062.214434857" lastFinishedPulling="2025-10-02 11:27:54.527395479 +0000 UTC m=+1075.077761843" observedRunningTime="2025-10-02 11:27:57.105665702 +0000 UTC m=+1077.656032086" watchObservedRunningTime="2025-10-02 11:27:57.111337126 +0000 UTC m=+1077.661703490"
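Reading the pod_startup_latency_tracker records: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling), so registry-side delay is excluded from the startup SLO. Records with zero-value pull timestamps (no pull needed, like the openstack-operator pod further up) show SLO equal to E2E. Checking the arithmetic against the ironic-operator record above, using the monotonic m=+ clock offsets:

    package main

    import "fmt"

    func main() {
        const (
            e2e                 = 18.037880198   // podStartE2EDuration, seconds
            firstStartedPulling = 1062.185083649 // m=+ offset, seconds
            lastFinishedPulling = 1075.079195802 // m=+ offset, seconds
        )
        pullWindow := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image-pull window:   %.9fs\n", pullWindow)     // 12.894112153s
        fmt.Printf("podStartSLOduration: %.9fs\n", e2e-pullWindow) // 5.143768045s, as logged
    }

The ~13s pull window on every one of these pods is the visible cost of the earlier "pull QPS exceeded" throttling and backoff.

Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.132626 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" podStartSLOduration=4.322064176 podStartE2EDuration="17.132607735s" podCreationTimestamp="2025-10-02 11:27:40 +0000 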
UTC" firstStartedPulling="2025-10-02 11:27:41.734482579 +0000 UTC m=+1062.284848943" lastFinishedPulling="2025-10-02 11:27:54.545026098 +0000 UTC m=+1075.095392502" observedRunningTime="2025-10-02 11:27:57.125774179 +0000 UTC m=+1077.676140543" watchObservedRunningTime="2025-10-02 11:27:57.132607735 +0000 UTC m=+1077.682974099" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.174656 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" podStartSLOduration=5.272402965 podStartE2EDuration="18.174638349s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.638189049 +0000 UTC m=+1062.188555413" lastFinishedPulling="2025-10-02 11:27:54.540424433 +0000 UTC m=+1075.090790797" observedRunningTime="2025-10-02 11:27:57.154888741 +0000 UTC m=+1077.705255105" watchObservedRunningTime="2025-10-02 11:27:57.174638349 +0000 UTC m=+1077.725004713" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.181332 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" podStartSLOduration=4.875790464 podStartE2EDuration="18.18132118s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.223320412 +0000 UTC m=+1061.773686776" lastFinishedPulling="2025-10-02 11:27:54.528851118 +0000 UTC m=+1075.079217492" observedRunningTime="2025-10-02 11:27:57.173546089 +0000 UTC m=+1077.723912453" watchObservedRunningTime="2025-10-02 11:27:57.18132118 +0000 UTC m=+1077.731687544" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.200274 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" podStartSLOduration=4.76427871 podStartE2EDuration="18.200256516s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:40.820843772 +0000 UTC m=+1061.371210126" lastFinishedPulling="2025-10-02 11:27:54.256821568 +0000 UTC m=+1074.807187932" observedRunningTime="2025-10-02 11:27:57.197242094 +0000 UTC m=+1077.747608458" watchObservedRunningTime="2025-10-02 11:27:57.200256516 +0000 UTC m=+1077.750622870" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.217989 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" podStartSLOduration=4.58172915 podStartE2EDuration="17.217949197s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.892845807 +0000 UTC m=+1062.443212171" lastFinishedPulling="2025-10-02 11:27:54.529065854 +0000 UTC m=+1075.079432218" observedRunningTime="2025-10-02 11:27:57.214200445 +0000 UTC m=+1077.764566799" watchObservedRunningTime="2025-10-02 11:27:57.217949197 +0000 UTC m=+1077.768315561" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.269585 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" podStartSLOduration=5.401581269 podStartE2EDuration="18.269561071s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.659192751 +0000 UTC m=+1062.209559115" lastFinishedPulling="2025-10-02 11:27:54.527172533 +0000 UTC m=+1075.077538917" observedRunningTime="2025-10-02 11:27:57.237261772 +0000 UTC m=+1077.787628136" 
watchObservedRunningTime="2025-10-02 11:27:57.269561071 +0000 UTC m=+1077.819927435" Oct 02 11:27:57 crc kubenswrapper[4929]: I1002 11:27:57.270149 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" podStartSLOduration=4.697281409 podStartE2EDuration="18.270139637s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:40.955480546 +0000 UTC m=+1061.505846910" lastFinishedPulling="2025-10-02 11:27:54.528338774 +0000 UTC m=+1075.078705138" observedRunningTime="2025-10-02 11:27:57.261736638 +0000 UTC m=+1077.812103002" watchObservedRunningTime="2025-10-02 11:27:57.270139637 +0000 UTC m=+1077.820506001" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.109087 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-w4lnt" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.118777 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-vbcph" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.170868 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-jvvpf" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.253312 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-9qljm" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.282502 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-lpvb9" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.301261 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-kg8nv" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.342325 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-s5528" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.390832 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-7q5cl" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.409091 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-6rxh4" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.518072 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-qktzg" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.544097 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-9kptb" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.576225 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jwjmh" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.648040 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-67mmn" Oct 02 11:28:00 
crc kubenswrapper[4929]: I1002 11:28:00.802625 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j2xw4" Oct 02 11:28:00 crc kubenswrapper[4929]: I1002 11:28:00.903360 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-whrgt" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.077985 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" event={"ID":"b2d4b0ec-0488-4157-b187-d63fbc4a932b","Type":"ContainerStarted","Data":"b54dd8da5411f5a989aaf8eb6f263bd02961ef4e410ff55f8262ac80f8832a0e"} Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.078415 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.080545 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" event={"ID":"c87923ae-b391-4bc6-8463-16d2f1c4427b","Type":"ContainerStarted","Data":"adfb018eb579a691a16c444c4ad2b4d5221779e376fad99adf7f93581486d72c"} Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.080698 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.082520 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" event={"ID":"66bdd8cf-6f68-48c3-af01-1785234e988f","Type":"ContainerStarted","Data":"d42b901507c52ca3bbf860865423faa37038386220f5a2e59184be8a70a58a29"} Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.082677 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.084550 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" event={"ID":"b1438f7d-c50e-42ed-b444-8a9d019a886b","Type":"ContainerStarted","Data":"5dc2ef3e407d4269b0ff8604528a577bb71e8077975be1fb80700354e027157b"} Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.084944 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.086384 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" event={"ID":"f2a55e76-361d-4a08-88e9-4b7594fda990","Type":"ContainerStarted","Data":"bf27a92b38ae3456cc6245ed70095c0183a5aa7b31af35b5c4a2381782535d2e"} Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.086642 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.088346 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" 
event={"ID":"06267520-c8b3-4436-8088-2c42fdd5c1d3","Type":"ContainerStarted","Data":"e83bcbb1a48e1db4371d7ad903e78cc035e688caf2e4c6a3f2be3bbb9d9c0279"} Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.088579 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.090655 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" event={"ID":"a8a1c59c-9589-4343-b7df-1052d533177e","Type":"ContainerStarted","Data":"e682f520647d4f75d4038c05fd0c6e1637edb2284c92b82cd2b9e0b9ee259c2a"} Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.121309 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" podStartSLOduration=2.719904038 podStartE2EDuration="23.121293132s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.915437432 +0000 UTC m=+1062.465803796" lastFinishedPulling="2025-10-02 11:28:02.316826526 +0000 UTC m=+1082.867192890" observedRunningTime="2025-10-02 11:28:03.103877048 +0000 UTC m=+1083.654243412" watchObservedRunningTime="2025-10-02 11:28:03.121293132 +0000 UTC m=+1083.671659496" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.124660 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" podStartSLOduration=3.507427919 podStartE2EDuration="24.124648544s" podCreationTimestamp="2025-10-02 11:27:39 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.743139155 +0000 UTC m=+1062.293505519" lastFinishedPulling="2025-10-02 11:28:02.36035978 +0000 UTC m=+1082.910726144" observedRunningTime="2025-10-02 11:28:03.117585221 +0000 UTC m=+1083.667951585" watchObservedRunningTime="2025-10-02 11:28:03.124648544 +0000 UTC m=+1083.675014908" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.146066 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" podStartSLOduration=3.51058571 podStartE2EDuration="23.146048036s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.744883022 +0000 UTC m=+1062.295249386" lastFinishedPulling="2025-10-02 11:28:01.380345348 +0000 UTC m=+1081.930711712" observedRunningTime="2025-10-02 11:28:03.143408864 +0000 UTC m=+1083.693775228" watchObservedRunningTime="2025-10-02 11:28:03.146048036 +0000 UTC m=+1083.696414400" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.162862 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-84jbl" podStartSLOduration=2.792048161 podStartE2EDuration="23.162844073s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="2025-10-02 11:27:42.008556905 +0000 UTC m=+1062.558923269" lastFinishedPulling="2025-10-02 11:28:02.379352817 +0000 UTC m=+1082.929719181" observedRunningTime="2025-10-02 11:28:03.157994211 +0000 UTC m=+1083.708360575" watchObservedRunningTime="2025-10-02 11:28:03.162844073 +0000 UTC m=+1083.713210437" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.199946 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" 
podStartSLOduration=3.274227469 podStartE2EDuration="23.199924601s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.918856115 +0000 UTC m=+1062.469222469" lastFinishedPulling="2025-10-02 11:28:01.844553237 +0000 UTC m=+1082.394919601" observedRunningTime="2025-10-02 11:28:03.179038353 +0000 UTC m=+1083.729404717" watchObservedRunningTime="2025-10-02 11:28:03.199924601 +0000 UTC m=+1083.750290975" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.204294 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" podStartSLOduration=2.605459195 podStartE2EDuration="23.20427004s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.761568386 +0000 UTC m=+1062.311934750" lastFinishedPulling="2025-10-02 11:28:02.360379231 +0000 UTC m=+1082.910745595" observedRunningTime="2025-10-02 11:28:03.194163205 +0000 UTC m=+1083.744529569" watchObservedRunningTime="2025-10-02 11:28:03.20427004 +0000 UTC m=+1083.754636414" Oct 02 11:28:03 crc kubenswrapper[4929]: I1002 11:28:03.225892 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" podStartSLOduration=2.77658405 podStartE2EDuration="23.225872557s" podCreationTimestamp="2025-10-02 11:27:40 +0000 UTC" firstStartedPulling="2025-10-02 11:27:41.914335292 +0000 UTC m=+1062.464701656" lastFinishedPulling="2025-10-02 11:28:02.363623799 +0000 UTC m=+1082.913990163" observedRunningTime="2025-10-02 11:28:03.222593768 +0000 UTC m=+1083.772960142" watchObservedRunningTime="2025-10-02 11:28:03.225872557 +0000 UTC m=+1083.776238921" Oct 02 11:28:10 crc kubenswrapper[4929]: I1002 11:28:10.608151 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-52g6g" Oct 02 11:28:10 crc kubenswrapper[4929]: I1002 11:28:10.662001 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-hk8xv" Oct 02 11:28:10 crc kubenswrapper[4929]: I1002 11:28:10.720798 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-gc9bk" Oct 02 11:28:11 crc kubenswrapper[4929]: I1002 11:28:11.036779 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-j5mr4" Oct 02 11:28:11 crc kubenswrapper[4929]: I1002 11:28:11.060065 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-9csrz" Oct 02 11:28:11 crc kubenswrapper[4929]: I1002 11:28:11.266734 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-ww7r7" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.875830 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p925h"] Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.879772 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.881731 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p925h"] Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.885153 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lmrsq" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.886785 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.887560 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.888391 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.915844 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4k6zw"] Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.918304 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:24 crc kubenswrapper[4929]: W1002 11:28:24.923911 4929 reflector.go:561] object-"openstack"/"dns-svc": failed to list *v1.ConfigMap: configmaps "dns-svc" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 02 11:28:24 crc kubenswrapper[4929]: E1002 11:28:24.924010 4929 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dns-svc\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"dns-svc\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.939399 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/965e17b3-8442-4b01-8f97-acb5dc284577-config\") pod \"dnsmasq-dns-675f4bcbfc-p925h\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.939518 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m254k\" (UniqueName: \"kubernetes.io/projected/965e17b3-8442-4b01-8f97-acb5dc284577-kube-api-access-m254k\") pod \"dnsmasq-dns-675f4bcbfc-p925h\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:24 crc kubenswrapper[4929]: I1002 11:28:24.963616 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4k6zw"] Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.040320 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-config\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.040422 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/965e17b3-8442-4b01-8f97-acb5dc284577-config\") pod \"dnsmasq-dns-675f4bcbfc-p925h\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.040459 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-499z7\" (UniqueName: \"kubernetes.io/projected/15485998-b208-475d-b264-84c43fa3ae5d-kube-api-access-499z7\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.040490 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m254k\" (UniqueName: \"kubernetes.io/projected/965e17b3-8442-4b01-8f97-acb5dc284577-kube-api-access-m254k\") pod \"dnsmasq-dns-675f4bcbfc-p925h\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.040511 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.041349 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/965e17b3-8442-4b01-8f97-acb5dc284577-config\") pod \"dnsmasq-dns-675f4bcbfc-p925h\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.058725 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m254k\" (UniqueName: \"kubernetes.io/projected/965e17b3-8442-4b01-8f97-acb5dc284577-kube-api-access-m254k\") pod \"dnsmasq-dns-675f4bcbfc-p925h\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.142184 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-499z7\" (UniqueName: \"kubernetes.io/projected/15485998-b208-475d-b264-84c43fa3ae5d-kube-api-access-499z7\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.142251 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.142291 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-config\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.143330 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-config\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.159701 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-499z7\" (UniqueName: \"kubernetes.io/projected/15485998-b208-475d-b264-84c43fa3ae5d-kube-api-access-499z7\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.196945 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.439869 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p925h"] Oct 02 11:28:25 crc kubenswrapper[4929]: I1002 11:28:25.456112 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:28:26 crc kubenswrapper[4929]: E1002 11:28:26.144612 4929 configmap.go:193] Couldn't get configMap openstack/dns-svc: failed to sync configmap cache: timed out waiting for the condition Oct 02 11:28:26 crc kubenswrapper[4929]: E1002 11:28:26.145231 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc podName:15485998-b208-475d-b264-84c43fa3ae5d nodeName:}" failed. No retries permitted until 2025-10-02 11:28:26.645201246 +0000 UTC m=+1107.195567610 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc") pod "dnsmasq-dns-78dd6ddcc-4k6zw" (UID: "15485998-b208-475d-b264-84c43fa3ae5d") : failed to sync configmap cache: timed out waiting for the condition Oct 02 11:28:26 crc kubenswrapper[4929]: I1002 11:28:26.190679 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 11:28:26 crc kubenswrapper[4929]: I1002 11:28:26.260414 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" event={"ID":"965e17b3-8442-4b01-8f97-acb5dc284577","Type":"ContainerStarted","Data":"8059cb3f8b58f0ad40d68ea2e962aa5baef2836a02d4f26e50fe13df38283cb6"} Oct 02 11:28:26 crc kubenswrapper[4929]: I1002 11:28:26.664218 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:26 crc kubenswrapper[4929]: I1002 11:28:26.665846 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4k6zw\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:26 crc kubenswrapper[4929]: I1002 11:28:26.761787 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:26 crc kubenswrapper[4929]: I1002 11:28:26.991040 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4k6zw"] Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.152522 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p925h"] Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.195365 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z6qpv"] Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.196861 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.270432 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" event={"ID":"15485998-b208-475d-b264-84c43fa3ae5d","Type":"ContainerStarted","Data":"b5a6586798edb22c223e3012fc1789b03ba64fa054a6ae45be6a64f758d3be41"} Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.272458 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdckm\" (UniqueName: \"kubernetes.io/projected/4132f319-3a7f-40df-a4bb-ffc9a51908da-kube-api-access-vdckm\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.272524 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-dns-svc\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.272601 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-config\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.275944 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z6qpv"] Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.374072 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-config\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.374156 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdckm\" (UniqueName: \"kubernetes.io/projected/4132f319-3a7f-40df-a4bb-ffc9a51908da-kube-api-access-vdckm\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.374186 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-dns-svc\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " 
pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.374965 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-dns-svc\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.374966 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-config\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.393866 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdckm\" (UniqueName: \"kubernetes.io/projected/4132f319-3a7f-40df-a4bb-ffc9a51908da-kube-api-access-vdckm\") pod \"dnsmasq-dns-666b6646f7-z6qpv\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.516647 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.628865 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4k6zw"] Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.660960 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8skmc"] Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.663067 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.678079 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8skmc"] Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.778388 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwcp\" (UniqueName: \"kubernetes.io/projected/3469a150-cd5c-4ef0-8eea-b9803a3175af-kube-api-access-qdwcp\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.778584 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.778642 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-config\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.881217 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwcp\" (UniqueName: \"kubernetes.io/projected/3469a150-cd5c-4ef0-8eea-b9803a3175af-kube-api-access-qdwcp\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.881331 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.881386 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-config\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.882740 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-config\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.882780 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.922826 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwcp\" (UniqueName: 
\"kubernetes.io/projected/3469a150-cd5c-4ef0-8eea-b9803a3175af-kube-api-access-qdwcp\") pod \"dnsmasq-dns-57d769cc4f-8skmc\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:27 crc kubenswrapper[4929]: I1002 11:28:27.989949 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.056256 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z6qpv"] Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.289448 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" event={"ID":"4132f319-3a7f-40df-a4bb-ffc9a51908da","Type":"ContainerStarted","Data":"2b32e1828b702930a4f376a180753c09494b75681d07abb15d65a40b7d966fbc"} Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.428713 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.430189 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.432221 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.438715 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.439316 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.439917 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x5c5f" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.440291 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.440572 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.440826 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.468145 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.485473 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8skmc"] Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491108 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491268 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491398 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491418 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491470 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrnt\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-kube-api-access-cqrnt\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491528 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491587 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491717 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be704e8e-9b46-4dfb-9363-278e61720eaa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491779 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be704e8e-9b46-4dfb-9363-278e61720eaa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491799 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.491836 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593652 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/be704e8e-9b46-4dfb-9363-278e61720eaa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593700 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593717 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be704e8e-9b46-4dfb-9363-278e61720eaa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593734 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593771 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593803 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593836 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593855 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593872 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrnt\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-kube-api-access-cqrnt\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.593894 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 
11:28:28.593913 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.594989 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.597378 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.598400 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be704e8e-9b46-4dfb-9363-278e61720eaa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.598513 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.598655 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.599385 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be704e8e-9b46-4dfb-9363-278e61720eaa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.600018 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.600612 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.601674 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " 
pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.609266 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.614132 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrnt\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-kube-api-access-cqrnt\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.625239 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.767690 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.855384 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.857093 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.861026 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.861145 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.861220 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.861304 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.861402 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.861477 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.867456 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4zxj9" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.873867 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.899751 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfb673e7-59bc-41d1-9bf0-d20527c4a740-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900052 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5nxf2\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-kube-api-access-5nxf2\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900078 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfb673e7-59bc-41d1-9bf0-d20527c4a740-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900122 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900140 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900162 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900212 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900234 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900302 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900323 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:28 crc kubenswrapper[4929]: I1002 11:28:28.900342 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.001902 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.001958 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002005 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002029 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfb673e7-59bc-41d1-9bf0-d20527c4a740-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002051 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxf2\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-kube-api-access-5nxf2\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002072 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfb673e7-59bc-41d1-9bf0-d20527c4a740-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002098 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002114 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002138 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002182 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002206 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.002652 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.003143 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.003387 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.003508 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.003881 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.007164 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.009225 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.009663 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.019147 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfb673e7-59bc-41d1-9bf0-d20527c4a740-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.022923 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfb673e7-59bc-41d1-9bf0-d20527c4a740-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.042732 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxf2\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-kube-api-access-5nxf2\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.044496 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.189613 4929 util.go:30] "No sandbox for pod can be found. 
Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.266552 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 02 11:28:29 crc kubenswrapper[4929]: W1002 11:28:29.273447 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe704e8e_9b46_4dfb_9363_278e61720eaa.slice/crio-bf7aba24704181a1c38632ffbceff0aea2923bdada51a9ab0d797148ce5f7bb4 WatchSource:0}: Error finding container bf7aba24704181a1c38632ffbceff0aea2923bdada51a9ab0d797148ce5f7bb4: Status 404 returned error can't find the container with id bf7aba24704181a1c38632ffbceff0aea2923bdada51a9ab0d797148ce5f7bb4
Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.296904 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"be704e8e-9b46-4dfb-9363-278e61720eaa","Type":"ContainerStarted","Data":"bf7aba24704181a1c38632ffbceff0aea2923bdada51a9ab0d797148ce5f7bb4"}
Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.298085 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" event={"ID":"3469a150-cd5c-4ef0-8eea-b9803a3175af","Type":"ContainerStarted","Data":"a64860d00add1d34029f4850fe6c9cf7bfd704bd769e89df732b3db5854f0b4e"}
Oct 02 11:28:29 crc kubenswrapper[4929]: I1002 11:28:29.683976 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 02 11:28:29 crc kubenswrapper[4929]: W1002 11:28:29.727213 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfb673e7_59bc_41d1_9bf0_d20527c4a740.slice/crio-54c0548cb2876ce82bba03d6ef6e8eaf0d8bb581208aa19783a534ab65ab4c5c WatchSource:0}: Error finding container 54c0548cb2876ce82bba03d6ef6e8eaf0d8bb581208aa19783a534ab65ab4c5c: Status 404 returned error can't find the container with id 54c0548cb2876ce82bba03d6ef6e8eaf0d8bb581208aa19783a534ab65ab4c5c
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.316046 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfb673e7-59bc-41d1-9bf0-d20527c4a740","Type":"ContainerStarted","Data":"54c0548cb2876ce82bba03d6ef6e8eaf0d8bb581208aa19783a534ab65ab4c5c"}
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.676836 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.678307 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.680041 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.680343 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.680533 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.681059 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.681210 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kdj6n"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.700282 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.727777 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.841949 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842081 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmj8d\" (UniqueName: \"kubernetes.io/projected/978200e0-025d-4000-baed-4ba85bf83c60-kube-api-access-vmj8d\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842139 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-config-data-default\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842170 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-secrets\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842189 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842228 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-kolla-config\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842272 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842295 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/978200e0-025d-4000-baed-4ba85bf83c60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.842318 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946173 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-kolla-config\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946584 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946617 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/978200e0-025d-4000-baed-4ba85bf83c60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946707 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946790 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946836 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmj8d\" (UniqueName: \"kubernetes.io/projected/978200e0-025d-4000-baed-4ba85bf83c60-kube-api-access-vmj8d\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946900 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-config-data-default\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946934 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.946950 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-secrets\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.947577 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/978200e0-025d-4000-baed-4ba85bf83c60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.947615 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-kolla-config\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.947987 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-config-data-default\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.948035 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.954168 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-secrets\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.956251 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.958881 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.959928 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:30 crc kubenswrapper[4929]: I1002 11:28:30.967074 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmj8d\" (UniqueName: \"kubernetes.io/projected/978200e0-025d-4000-baed-4ba85bf83c60-kube-api-access-vmj8d\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.028851 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " pod="openstack/openstack-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.331581 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.685405 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.687799 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.693526 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.693754 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4n9fb"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.693865 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.696113 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.698159 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.780205 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 11:28:31 crc kubenswrapper[4929]: W1002 11:28:31.797242 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod978200e0_025d_4000_baed_4ba85bf83c60.slice/crio-dd72fbe7680edf1cfb1f1f34ca1f15a94207241af7cce5c63433d5fe23113c0c WatchSource:0}: Error finding container dd72fbe7680edf1cfb1f1f34ca1f15a94207241af7cce5c63433d5fe23113c0c: Status 404 returned error can't find the container with id dd72fbe7680edf1cfb1f1f34ca1f15a94207241af7cce5c63433d5fe23113c0c
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.834888 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.838583 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.842375 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vzwdh"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.842434 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.842598 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.848613 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.866919 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.866992 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.867223 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.867350 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.867385 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.867408 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.867436 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.867452 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.867469 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmq7n\" (UniqueName: \"kubernetes.io/projected/b07c8ee2-5443-410c-b2ab-b48699694626-kube-api-access-vmq7n\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.968875 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6n44\" (UniqueName: \"kubernetes.io/projected/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kube-api-access-h6n44\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.968923 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.968952 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969007 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-config-data\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969037 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969072 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969096 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969118 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmq7n\" (UniqueName: \"kubernetes.io/projected/b07c8ee2-5443-410c-b2ab-b48699694626-kube-api-access-vmq7n\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969168 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969195 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969227 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kolla-config\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969260 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969299 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.969328 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.970311 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.970495 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0"
Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.972032 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0"
\"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.976263 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.977061 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.977806 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.980467 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.985444 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:31 crc kubenswrapper[4929]: I1002 11:28:31.991638 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmq7n\" (UniqueName: \"kubernetes.io/projected/b07c8ee2-5443-410c-b2ab-b48699694626-kube-api-access-vmq7n\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.022782 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.071581 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kolla-config\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.071646 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc 
kubenswrapper[4929]: I1002 11:28:32.071696 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.071766 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6n44\" (UniqueName: \"kubernetes.io/projected/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kube-api-access-h6n44\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.071808 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-config-data\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.072387 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kolla-config\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.074473 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-config-data\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.075519 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.078567 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.094352 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6n44\" (UniqueName: \"kubernetes.io/projected/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kube-api-access-h6n44\") pod \"memcached-0\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.163284 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.319678 4929 util.go:30] "No sandbox for pod can be found. 
Oct 02 11:28:32 crc kubenswrapper[4929]: I1002 11:28:32.345355 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"978200e0-025d-4000-baed-4ba85bf83c60","Type":"ContainerStarted","Data":"dd72fbe7680edf1cfb1f1f34ca1f15a94207241af7cce5c63433d5fe23113c0c"}
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.595123 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.605917 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.610118 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jwh5n"
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.617911 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.711907 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrnf6\" (UniqueName: \"kubernetes.io/projected/469da009-8740-4581-90b5-1e99b80a7f81-kube-api-access-qrnf6\") pod \"kube-state-metrics-0\" (UID: \"469da009-8740-4581-90b5-1e99b80a7f81\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.813139 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrnf6\" (UniqueName: \"kubernetes.io/projected/469da009-8740-4581-90b5-1e99b80a7f81-kube-api-access-qrnf6\") pod \"kube-state-metrics-0\" (UID: \"469da009-8740-4581-90b5-1e99b80a7f81\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.854061 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrnf6\" (UniqueName: \"kubernetes.io/projected/469da009-8740-4581-90b5-1e99b80a7f81-kube-api-access-qrnf6\") pod \"kube-state-metrics-0\" (UID: \"469da009-8740-4581-90b5-1e99b80a7f81\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:28:33 crc kubenswrapper[4929]: I1002 11:28:33.925210 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.225134 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8kqgz"]
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.228084 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8kqgz"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.236858 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.237166 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-twwcg"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.237404 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.238456 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8kqgz"]
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.255006 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fv8ff"]
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.257938 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.265335 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fv8ff"]
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390439 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-log\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390521 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390553 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e942503-506b-4a11-aa8b-ca122be42fbb-scripts\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390581 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/752197b6-8008-4699-895b-4cbf3d475e96-scripts\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390643 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-run\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390674 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-ovn-controller-tls-certs\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz"
pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390700 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsb4b\" (UniqueName: \"kubernetes.io/projected/752197b6-8008-4699-895b-4cbf3d475e96-kube-api-access-bsb4b\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390747 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czpkw\" (UniqueName: \"kubernetes.io/projected/0e942503-506b-4a11-aa8b-ca122be42fbb-kube-api-access-czpkw\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390777 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run-ovn\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390826 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-lib\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390845 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-log-ovn\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390873 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-etc-ovs\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.390909 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-combined-ca-bundle\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491767 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-run\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491820 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-ovn-controller-tls-certs\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" 
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491843 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsb4b\" (UniqueName: \"kubernetes.io/projected/752197b6-8008-4699-895b-4cbf3d475e96-kube-api-access-bsb4b\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491881 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czpkw\" (UniqueName: \"kubernetes.io/projected/0e942503-506b-4a11-aa8b-ca122be42fbb-kube-api-access-czpkw\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491908 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run-ovn\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491925 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-lib\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491941 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-log-ovn\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.491975 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-etc-ovs\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.492005 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-combined-ca-bundle\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.492021 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-log\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.492037 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.492055 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0e942503-506b-4a11-aa8b-ca122be42fbb-scripts\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.492083 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/752197b6-8008-4699-895b-4cbf3d475e96-scripts\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.493243 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-etc-ovs\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.494282 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/752197b6-8008-4699-895b-4cbf3d475e96-scripts\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.494922 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e942503-506b-4a11-aa8b-ca122be42fbb-scripts\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.498831 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-combined-ca-bundle\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.500064 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-ovn-controller-tls-certs\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.508198 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-run\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.515237 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czpkw\" (UniqueName: \"kubernetes.io/projected/0e942503-506b-4a11-aa8b-ca122be42fbb-kube-api-access-czpkw\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.516623 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsb4b\" (UniqueName: \"kubernetes.io/projected/752197b6-8008-4699-895b-4cbf3d475e96-kube-api-access-bsb4b\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz" Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.517285 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-lib\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.517478 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.517580 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-log-ovn\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.517696 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-log\") pod \"ovn-controller-ovs-fv8ff\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") " pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.518562 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run-ovn\") pod \"ovn-controller-8kqgz\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " pod="openstack/ovn-controller-8kqgz"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.568450 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8kqgz"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.596339 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.763933 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.765244 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.768912 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.769233 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.769293 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.769577 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-krs2q"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.769365 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.780043 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.898522 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.898599 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-config\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.898731 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.898852 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.899022 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.899054 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.899205 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26fx6\" (UniqueName: \"kubernetes.io/projected/d4a10ac0-e47f-47cf-9779-d60c30b14755-kube-api-access-26fx6\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:38 crc kubenswrapper[4929]: I1002 11:28:38.899243 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.000680 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.000764 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.000785 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.001455 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.001596 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26fx6\" (UniqueName: \"kubernetes.io/projected/d4a10ac0-e47f-47cf-9779-d60c30b14755-kube-api-access-26fx6\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.001625 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.003722 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.001656 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0"
Oct 02 11:28:39 crc
kubenswrapper[4929]: I1002 11:28:39.003871 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-config\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.012185 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.005475 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.006705 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.005640 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.012456 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.013236 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-config\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.021548 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26fx6\" (UniqueName: \"kubernetes.io/projected/d4a10ac0-e47f-47cf-9779-d60c30b14755-kube-api-access-26fx6\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.036724 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:39 crc kubenswrapper[4929]: I1002 11:28:39.084523 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.334797 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.341210 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.344601 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-vhgqd" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.344659 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.344661 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.344919 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.349649 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.439580 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.439652 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.439832 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.440100 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.440152 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdvn\" (UniqueName: \"kubernetes.io/projected/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-kube-api-access-2qdvn\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.440181 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 
11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.440386 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.440417 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541415 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541467 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdvn\" (UniqueName: \"kubernetes.io/projected/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-kube-api-access-2qdvn\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541489 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541562 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541587 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541633 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541662 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541693 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.541857 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.542244 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.542574 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.542747 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.548926 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.552502 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.555224 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.562017 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.566899 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdvn\" (UniqueName: \"kubernetes.io/projected/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-kube-api-access-2qdvn\") pod \"ovsdbserver-sb-0\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:40 crc kubenswrapper[4929]: I1002 11:28:40.666536 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:28:45 crc kubenswrapper[4929]: E1002 11:28:45.420121 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 02 11:28:45 crc kubenswrapper[4929]: E1002 11:28:45.420573 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cqrnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(be704e8e-9b46-4dfb-9363-278e61720eaa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:28:45 crc kubenswrapper[4929]: E1002 11:28:45.422089 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" 
podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" Oct 02 11:28:45 crc kubenswrapper[4929]: E1002 11:28:45.446483 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" Oct 02 11:28:47 crc kubenswrapper[4929]: E1002 11:28:47.402667 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:28:47 crc kubenswrapper[4929]: E1002 11:28:47.403141 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m254k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-p925h_openstack(965e17b3-8442-4b01-8f97-acb5dc284577): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:28:47 crc kubenswrapper[4929]: E1002 11:28:47.404439 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" podUID="965e17b3-8442-4b01-8f97-acb5dc284577" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.229264 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.230103 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-499z7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4k6zw_openstack(15485998-b208-475d-b264-84c43fa3ae5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.231354 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" podUID="15485998-b208-475d-b264-84c43fa3ae5d" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.253806 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.253976 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 
5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qdwcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-8skmc_openstack(3469a150-cd5c-4ef0-8eea-b9803a3175af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.255372 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" podUID="3469a150-cd5c-4ef0-8eea-b9803a3175af" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.270483 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.270647 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdckm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-z6qpv_openstack(4132f319-3a7f-40df-a4bb-ffc9a51908da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.271774 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" podUID="4132f319-3a7f-40df-a4bb-ffc9a51908da" Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.326111 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.480671 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/965e17b3-8442-4b01-8f97-acb5dc284577-config\") pod \"965e17b3-8442-4b01-8f97-acb5dc284577\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.480877 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m254k\" (UniqueName: \"kubernetes.io/projected/965e17b3-8442-4b01-8f97-acb5dc284577-kube-api-access-m254k\") pod \"965e17b3-8442-4b01-8f97-acb5dc284577\" (UID: \"965e17b3-8442-4b01-8f97-acb5dc284577\") " Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.482234 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/965e17b3-8442-4b01-8f97-acb5dc284577-config" (OuterVolumeSpecName: "config") pod "965e17b3-8442-4b01-8f97-acb5dc284577" (UID: "965e17b3-8442-4b01-8f97-acb5dc284577"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.485591 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.485738 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p925h" event={"ID":"965e17b3-8442-4b01-8f97-acb5dc284577","Type":"ContainerDied","Data":"8059cb3f8b58f0ad40d68ea2e962aa5baef2836a02d4f26e50fe13df38283cb6"} Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.486636 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/965e17b3-8442-4b01-8f97-acb5dc284577-kube-api-access-m254k" (OuterVolumeSpecName: "kube-api-access-m254k") pod "965e17b3-8442-4b01-8f97-acb5dc284577" (UID: "965e17b3-8442-4b01-8f97-acb5dc284577"). InnerVolumeSpecName "kube-api-access-m254k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.492397 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" podUID="3469a150-cd5c-4ef0-8eea-b9803a3175af" Oct 02 11:28:49 crc kubenswrapper[4929]: E1002 11:28:49.492598 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" podUID="4132f319-3a7f-40df-a4bb-ffc9a51908da" Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.583952 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m254k\" (UniqueName: \"kubernetes.io/projected/965e17b3-8442-4b01-8f97-acb5dc284577-kube-api-access-m254k\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.584001 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/965e17b3-8442-4b01-8f97-acb5dc284577-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.846757 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 11:28:49 crc kubenswrapper[4929]: W1002 11:28:49.870441 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56c5fe9e_033d_4c3b_a71f_e2c215add4c5.slice/crio-275d09b21fbaf637c8ff4e1551f86439e4b28c0a701f2cf9546a5d93df1d005b WatchSource:0}: Error finding container 275d09b21fbaf637c8ff4e1551f86439e4b28c0a701f2cf9546a5d93df1d005b: Status 404 returned error can't find the container with id 275d09b21fbaf637c8ff4e1551f86439e4b28c0a701f2cf9546a5d93df1d005b Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.904764 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p925h"] Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.919026 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p925h"] Oct 02 11:28:49 crc kubenswrapper[4929]: I1002 11:28:49.979498 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:28:49 crc 
kubenswrapper[4929]: I1002 11:28:49.983549 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:28:49 crc kubenswrapper[4929]: W1002 11:28:49.998099 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod469da009_8740_4581_90b5_1e99b80a7f81.slice/crio-33198bda049795a7c7dd988fabccec3edfb18ccb9fc422050c86f615c36fb3f9 WatchSource:0}: Error finding container 33198bda049795a7c7dd988fabccec3edfb18ccb9fc422050c86f615c36fb3f9: Status 404 returned error can't find the container with id 33198bda049795a7c7dd988fabccec3edfb18ccb9fc422050c86f615c36fb3f9 Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.049020 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.063677 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fv8ff"] Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.169201 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="965e17b3-8442-4b01-8f97-acb5dc284577" path="/var/lib/kubelet/pods/965e17b3-8442-4b01-8f97-acb5dc284577/volumes" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.205748 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-499z7\" (UniqueName: \"kubernetes.io/projected/15485998-b208-475d-b264-84c43fa3ae5d-kube-api-access-499z7\") pod \"15485998-b208-475d-b264-84c43fa3ae5d\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.205979 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-config\") pod \"15485998-b208-475d-b264-84c43fa3ae5d\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.206011 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc\") pod \"15485998-b208-475d-b264-84c43fa3ae5d\" (UID: \"15485998-b208-475d-b264-84c43fa3ae5d\") " Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.206518 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-config" (OuterVolumeSpecName: "config") pod "15485998-b208-475d-b264-84c43fa3ae5d" (UID: "15485998-b208-475d-b264-84c43fa3ae5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.206787 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15485998-b208-475d-b264-84c43fa3ae5d" (UID: "15485998-b208-475d-b264-84c43fa3ae5d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.209940 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15485998-b208-475d-b264-84c43fa3ae5d-kube-api-access-499z7" (OuterVolumeSpecName: "kube-api-access-499z7") pod "15485998-b208-475d-b264-84c43fa3ae5d" (UID: "15485998-b208-475d-b264-84c43fa3ae5d"). 
InnerVolumeSpecName "kube-api-access-499z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.310275 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-499z7\" (UniqueName: \"kubernetes.io/projected/15485998-b208-475d-b264-84c43fa3ae5d-kube-api-access-499z7\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.310326 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.310343 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15485998-b208-475d-b264-84c43fa3ae5d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.493501 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506710 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8kqgz"] Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506753 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8946c48-0a50-449c-b64a-e8e4ae2f84ba","Type":"ContainerStarted","Data":"f9bea5dcf2e3eda55b2a4b8dd153572fc11deb39fa45a41ab37dcfd6e9d95f97"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506786 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4k6zw" event={"ID":"15485998-b208-475d-b264-84c43fa3ae5d","Type":"ContainerDied","Data":"b5a6586798edb22c223e3012fc1789b03ba64fa054a6ae45be6a64f758d3be41"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506804 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"56c5fe9e-033d-4c3b-a71f-e2c215add4c5","Type":"ContainerStarted","Data":"275d09b21fbaf637c8ff4e1551f86439e4b28c0a701f2cf9546a5d93df1d005b"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506816 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fv8ff" event={"ID":"0e942503-506b-4a11-aa8b-ca122be42fbb","Type":"ContainerStarted","Data":"ffd7c7ace908e9ba79b8e86f1630c30d209c74656698a0c176f90bf3cabe102d"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506828 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469da009-8740-4581-90b5-1e99b80a7f81","Type":"ContainerStarted","Data":"33198bda049795a7c7dd988fabccec3edfb18ccb9fc422050c86f615c36fb3f9"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506842 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.506855 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"978200e0-025d-4000-baed-4ba85bf83c60","Type":"ContainerStarted","Data":"9b31c710f5e16531b1e61137b047da65ed86c42222822c83a63e2d292b03a7f8"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.507050 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b07c8ee2-5443-410c-b2ab-b48699694626","Type":"ContainerStarted","Data":"0af63ab6c39474b0cbbc4d5e79bcc89441bf8fd2cd3ab19fdedfebef27a37122"} Oct 02 11:28:50 crc 
kubenswrapper[4929]: I1002 11:28:50.507099 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b07c8ee2-5443-410c-b2ab-b48699694626","Type":"ContainerStarted","Data":"ec152cf5812d006e96ec0a494585fc49440422e421eb3a4d079d7093860d86a1"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.508169 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz" event={"ID":"752197b6-8008-4699-895b-4cbf3d475e96","Type":"ContainerStarted","Data":"55f0d042af1930cc8212ac7acfbb68a154146277736cbbfb00842d288dca9c54"} Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.554143 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4k6zw"] Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.561908 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4k6zw"] Oct 02 11:28:50 crc kubenswrapper[4929]: I1002 11:28:50.987559 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:28:51 crc kubenswrapper[4929]: I1002 11:28:51.517459 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfb673e7-59bc-41d1-9bf0-d20527c4a740","Type":"ContainerStarted","Data":"dcb01c0ec91fa8b636cd159dd6d4fbe9815deb68d2051731a33d12b7eda329bb"} Oct 02 11:28:51 crc kubenswrapper[4929]: I1002 11:28:51.518813 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d4a10ac0-e47f-47cf-9779-d60c30b14755","Type":"ContainerStarted","Data":"cd475f3b9f729cb1b71f3428fd7ec5c6534a3c592ebeea0d48699cbcb45c3273"} Oct 02 11:28:52 crc kubenswrapper[4929]: I1002 11:28:52.167668 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15485998-b208-475d-b264-84c43fa3ae5d" path="/var/lib/kubelet/pods/15485998-b208-475d-b264-84c43fa3ae5d/volumes" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.277832 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ld7dp"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.279379 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.281275 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.298390 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ld7dp"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.390118 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovn-rundir\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.390161 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49d60065-8bbd-4182-be31-c0f851790792-config\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.390192 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovs-rundir\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.390406 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q4j\" (UniqueName: \"kubernetes.io/projected/49d60065-8bbd-4182-be31-c0f851790792-kube-api-access-s9q4j\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.390522 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-combined-ca-bundle\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.390594 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.410208 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8skmc"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.483930 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-qmk6j"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.485554 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.489567 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.491764 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.491839 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovn-rundir\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.491868 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49d60065-8bbd-4182-be31-c0f851790792-config\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.492691 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovs-rundir\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.492766 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9q4j\" (UniqueName: \"kubernetes.io/projected/49d60065-8bbd-4182-be31-c0f851790792-kube-api-access-s9q4j\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.492869 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-combined-ca-bundle\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.494378 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-qmk6j"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.496175 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovs-rundir\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.496309 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49d60065-8bbd-4182-be31-c0f851790792-config\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.496510 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovn-rundir\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.505009 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-combined-ca-bundle\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.512960 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.524556 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9q4j\" (UniqueName: \"kubernetes.io/projected/49d60065-8bbd-4182-be31-c0f851790792-kube-api-access-s9q4j\") pod \"ovn-controller-metrics-ld7dp\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.587217 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z6qpv"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.594786 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.594829 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-config\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.594848 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.594888 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlfbr\" (UniqueName: \"kubernetes.io/projected/f960fb69-8ee1-4711-b685-eb8dd09a393f-kube-api-access-jlfbr\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.609790 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.651912 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-c4mh9"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.653383 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.657527 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.664970 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-c4mh9"] Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.696473 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.696564 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn48w\" (UniqueName: \"kubernetes.io/projected/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-kube-api-access-fn48w\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.696620 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.696670 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-config\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.696701 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.696724 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-dns-svc\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.696794 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: 
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.697099 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-config\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.698042 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.698557 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-config\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.704283 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.722406 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlfbr\" (UniqueName: \"kubernetes.io/projected/f960fb69-8ee1-4711-b685-eb8dd09a393f-kube-api-access-jlfbr\") pod \"dnsmasq-dns-5bf47b49b7-qmk6j\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.798325 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-config\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.798436 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.798462 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn48w\" (UniqueName: \"kubernetes.io/projected/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-kube-api-access-fn48w\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9"
Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.798493 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-dns-svc\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9"
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-dns-svc\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.798530 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.799496 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-config\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.799903 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-dns-svc\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.799924 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.800591 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.802595 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.816176 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn48w\" (UniqueName: \"kubernetes.io/projected/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-kube-api-access-fn48w\") pod \"dnsmasq-dns-8554648995-c4mh9\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.865862 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.899401 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdwcp\" (UniqueName: \"kubernetes.io/projected/3469a150-cd5c-4ef0-8eea-b9803a3175af-kube-api-access-qdwcp\") pod \"3469a150-cd5c-4ef0-8eea-b9803a3175af\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.899499 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-dns-svc\") pod \"3469a150-cd5c-4ef0-8eea-b9803a3175af\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.899565 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-config\") pod \"3469a150-cd5c-4ef0-8eea-b9803a3175af\" (UID: \"3469a150-cd5c-4ef0-8eea-b9803a3175af\") " Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.900301 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-config" (OuterVolumeSpecName: "config") pod "3469a150-cd5c-4ef0-8eea-b9803a3175af" (UID: "3469a150-cd5c-4ef0-8eea-b9803a3175af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.900369 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3469a150-cd5c-4ef0-8eea-b9803a3175af" (UID: "3469a150-cd5c-4ef0-8eea-b9803a3175af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.906650 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3469a150-cd5c-4ef0-8eea-b9803a3175af-kube-api-access-qdwcp" (OuterVolumeSpecName: "kube-api-access-qdwcp") pod "3469a150-cd5c-4ef0-8eea-b9803a3175af" (UID: "3469a150-cd5c-4ef0-8eea-b9803a3175af"). InnerVolumeSpecName "kube-api-access-qdwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.981765 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:55 crc kubenswrapper[4929]: I1002 11:28:55.989291 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.000885 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-config\") pod \"4132f319-3a7f-40df-a4bb-ffc9a51908da\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.000942 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdckm\" (UniqueName: \"kubernetes.io/projected/4132f319-3a7f-40df-a4bb-ffc9a51908da-kube-api-access-vdckm\") pod \"4132f319-3a7f-40df-a4bb-ffc9a51908da\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.000985 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-dns-svc\") pod \"4132f319-3a7f-40df-a4bb-ffc9a51908da\" (UID: \"4132f319-3a7f-40df-a4bb-ffc9a51908da\") " Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.001374 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.001387 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdwcp\" (UniqueName: \"kubernetes.io/projected/3469a150-cd5c-4ef0-8eea-b9803a3175af-kube-api-access-qdwcp\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.001397 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3469a150-cd5c-4ef0-8eea-b9803a3175af-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.002084 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4132f319-3a7f-40df-a4bb-ffc9a51908da" (UID: "4132f319-3a7f-40df-a4bb-ffc9a51908da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.005173 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4132f319-3a7f-40df-a4bb-ffc9a51908da-kube-api-access-vdckm" (OuterVolumeSpecName: "kube-api-access-vdckm") pod "4132f319-3a7f-40df-a4bb-ffc9a51908da" (UID: "4132f319-3a7f-40df-a4bb-ffc9a51908da"). InnerVolumeSpecName "kube-api-access-vdckm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.044465 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-config" (OuterVolumeSpecName: "config") pod "4132f319-3a7f-40df-a4bb-ffc9a51908da" (UID: "4132f319-3a7f-40df-a4bb-ffc9a51908da"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.103392 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.103422 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdckm\" (UniqueName: \"kubernetes.io/projected/4132f319-3a7f-40df-a4bb-ffc9a51908da-kube-api-access-vdckm\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.103431 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132f319-3a7f-40df-a4bb-ffc9a51908da-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.175880 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ld7dp"] Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.336651 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-qmk6j"] Oct 02 11:28:56 crc kubenswrapper[4929]: W1002 11:28:56.341045 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf960fb69_8ee1_4711_b685_eb8dd09a393f.slice/crio-bf3f0b6b29e3047e7c1b81a1462e9b081062cc5a7f8c6e23016a51d441c12f1b WatchSource:0}: Error finding container bf3f0b6b29e3047e7c1b81a1462e9b081062cc5a7f8c6e23016a51d441c12f1b: Status 404 returned error can't find the container with id bf3f0b6b29e3047e7c1b81a1462e9b081062cc5a7f8c6e23016a51d441c12f1b Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.482369 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-c4mh9"] Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.565310 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" event={"ID":"3469a150-cd5c-4ef0-8eea-b9803a3175af","Type":"ContainerDied","Data":"a64860d00add1d34029f4850fe6c9cf7bfd704bd769e89df732b3db5854f0b4e"} Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.565344 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8skmc" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.566559 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" event={"ID":"f960fb69-8ee1-4711-b685-eb8dd09a393f","Type":"ContainerStarted","Data":"bf3f0b6b29e3047e7c1b81a1462e9b081062cc5a7f8c6e23016a51d441c12f1b"} Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.567526 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ld7dp" event={"ID":"49d60065-8bbd-4182-be31-c0f851790792","Type":"ContainerStarted","Data":"e08e1c3c89fd6abea1f07639ead7380e935d6a53a8c6b44a0f36153d671d3f11"} Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.568514 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" event={"ID":"4132f319-3a7f-40df-a4bb-ffc9a51908da","Type":"ContainerDied","Data":"2b32e1828b702930a4f376a180753c09494b75681d07abb15d65a40b7d966fbc"} Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.568571 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-z6qpv" Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.604014 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8skmc"] Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.613532 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8skmc"] Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.641413 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z6qpv"] Oct 02 11:28:56 crc kubenswrapper[4929]: I1002 11:28:56.650607 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-z6qpv"] Oct 02 11:28:57 crc kubenswrapper[4929]: W1002 11:28:57.786343 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e81b61_3c27_40f2_8418_eb7fa38d00a5.slice/crio-71780584a2d70664e45a600bf3c893ed2b312e59bfbbab9be360c87d613a6ed4 WatchSource:0}: Error finding container 71780584a2d70664e45a600bf3c893ed2b312e59bfbbab9be360c87d613a6ed4: Status 404 returned error can't find the container with id 71780584a2d70664e45a600bf3c893ed2b312e59bfbbab9be360c87d613a6ed4 Oct 02 11:28:58 crc kubenswrapper[4929]: I1002 11:28:58.175965 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3469a150-cd5c-4ef0-8eea-b9803a3175af" path="/var/lib/kubelet/pods/3469a150-cd5c-4ef0-8eea-b9803a3175af/volumes" Oct 02 11:28:58 crc kubenswrapper[4929]: I1002 11:28:58.176369 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4132f319-3a7f-40df-a4bb-ffc9a51908da" path="/var/lib/kubelet/pods/4132f319-3a7f-40df-a4bb-ffc9a51908da/volumes" Oct 02 11:28:58 crc kubenswrapper[4929]: I1002 11:28:58.593196 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-c4mh9" event={"ID":"d8e81b61-3c27-40f2-8418-eb7fa38d00a5","Type":"ContainerStarted","Data":"71780584a2d70664e45a600bf3c893ed2b312e59bfbbab9be360c87d613a6ed4"} Oct 02 11:28:59 crc kubenswrapper[4929]: I1002 11:28:59.600906 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"be704e8e-9b46-4dfb-9363-278e61720eaa","Type":"ContainerStarted","Data":"d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919"} Oct 02 11:28:59 crc kubenswrapper[4929]: I1002 11:28:59.606042 4929 generic.go:334] "Generic (PLEG): container finished" podID="978200e0-025d-4000-baed-4ba85bf83c60" containerID="9b31c710f5e16531b1e61137b047da65ed86c42222822c83a63e2d292b03a7f8" exitCode=0 Oct 02 11:28:59 crc kubenswrapper[4929]: I1002 11:28:59.606284 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"978200e0-025d-4000-baed-4ba85bf83c60","Type":"ContainerDied","Data":"9b31c710f5e16531b1e61137b047da65ed86c42222822c83a63e2d292b03a7f8"} Oct 02 11:29:00 crc kubenswrapper[4929]: I1002 11:29:00.616391 4929 generic.go:334] "Generic (PLEG): container finished" podID="b07c8ee2-5443-410c-b2ab-b48699694626" containerID="0af63ab6c39474b0cbbc4d5e79bcc89441bf8fd2cd3ab19fdedfebef27a37122" exitCode=0 Oct 02 11:29:00 crc kubenswrapper[4929]: I1002 11:29:00.616503 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b07c8ee2-5443-410c-b2ab-b48699694626","Type":"ContainerDied","Data":"0af63ab6c39474b0cbbc4d5e79bcc89441bf8fd2cd3ab19fdedfebef27a37122"} Oct 02 11:29:05 crc kubenswrapper[4929]: I1002 
Oct 02 11:29:05 crc kubenswrapper[4929]: I1002 11:29:05.652451 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 02 11:29:05 crc kubenswrapper[4929]: I1002 11:29:05.653721 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"978200e0-025d-4000-baed-4ba85bf83c60","Type":"ContainerStarted","Data":"feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d"}
Oct 02 11:29:05 crc kubenswrapper[4929]: I1002 11:29:05.655546 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b07c8ee2-5443-410c-b2ab-b48699694626","Type":"ContainerStarted","Data":"2b859fd219d68a03c833c80a4486933cb925eae152050cfa50df66277e417160"}
Oct 02 11:29:05 crc kubenswrapper[4929]: I1002 11:29:05.667580 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=26.131582762 podStartE2EDuration="34.66756268s" podCreationTimestamp="2025-10-02 11:28:31 +0000 UTC" firstStartedPulling="2025-10-02 11:28:49.872227626 +0000 UTC m=+1130.422593990" lastFinishedPulling="2025-10-02 11:28:58.408207534 +0000 UTC m=+1138.958573908" observedRunningTime="2025-10-02 11:29:05.665133694 +0000 UTC m=+1146.215500058" watchObservedRunningTime="2025-10-02 11:29:05.66756268 +0000 UTC m=+1146.217929054"
Oct 02 11:29:05 crc kubenswrapper[4929]: I1002 11:29:05.686256 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.686237858 podStartE2EDuration="35.686237858s" podCreationTimestamp="2025-10-02 11:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:29:05.682112516 +0000 UTC m=+1146.232478880" watchObservedRunningTime="2025-10-02 11:29:05.686237858 +0000 UTC m=+1146.236604222"
Oct 02 11:29:05 crc kubenswrapper[4929]: I1002 11:29:05.709130 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.24619301 podStartE2EDuration="36.7091146s" podCreationTimestamp="2025-10-02 11:28:29 +0000 UTC" firstStartedPulling="2025-10-02 11:28:31.809733654 +0000 UTC m=+1112.360100018" lastFinishedPulling="2025-10-02 11:28:49.272655244 +0000 UTC m=+1129.823021608" observedRunningTime="2025-10-02 11:29:05.7006445 +0000 UTC m=+1146.251010874" watchObservedRunningTime="2025-10-02 11:29:05.7091146 +0000 UTC m=+1146.259480964"
Oct 02 11:29:06 crc kubenswrapper[4929]: I1002 11:29:06.680876 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-c4mh9" event={"ID":"d8e81b61-3c27-40f2-8418-eb7fa38d00a5","Type":"ContainerStarted","Data":"a2b3c63cd957d0d6465546090e7df57ed9c4c41564f42883287a45d18e14c8e9"}
Oct 02 11:29:06 crc kubenswrapper[4929]: I1002 11:29:06.684257 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8946c48-0a50-449c-b64a-e8e4ae2f84ba","Type":"ContainerStarted","Data":"45099a9a81d331acf15cd5d4cf4ab34cdedd4a4c511ece106065205a559fb3ec"}
Oct 02 11:29:06 crc kubenswrapper[4929]: I1002 11:29:06.689915 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d4a10ac0-e47f-47cf-9779-d60c30b14755","Type":"ContainerStarted","Data":"0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773"}
pod="openstack/ovsdbserver-nb-0" event={"ID":"d4a10ac0-e47f-47cf-9779-d60c30b14755","Type":"ContainerStarted","Data":"0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773"} Oct 02 11:29:06 crc kubenswrapper[4929]: I1002 11:29:06.692404 4929 generic.go:334] "Generic (PLEG): container finished" podID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerID="b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c" exitCode=0 Oct 02 11:29:06 crc kubenswrapper[4929]: I1002 11:29:06.692542 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" event={"ID":"f960fb69-8ee1-4711-b685-eb8dd09a393f","Type":"ContainerDied","Data":"b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.702306 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ld7dp" event={"ID":"49d60065-8bbd-4182-be31-c0f851790792","Type":"ContainerStarted","Data":"0d5be4bb5d6960bc1b5676ae7124167aab77ea28d2f3f417b7410eab7da60d97"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.704751 4929 generic.go:334] "Generic (PLEG): container finished" podID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerID="a2b3c63cd957d0d6465546090e7df57ed9c4c41564f42883287a45d18e14c8e9" exitCode=0 Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.704833 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-c4mh9" event={"ID":"d8e81b61-3c27-40f2-8418-eb7fa38d00a5","Type":"ContainerDied","Data":"a2b3c63cd957d0d6465546090e7df57ed9c4c41564f42883287a45d18e14c8e9"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.704856 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-c4mh9" event={"ID":"d8e81b61-3c27-40f2-8418-eb7fa38d00a5","Type":"ContainerStarted","Data":"4d4355c4dc4d2e4f00a14252d9737d8f37882243df924d832eac53e1c23cdf47"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.705141 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.706774 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz" event={"ID":"752197b6-8008-4699-895b-4cbf3d475e96","Type":"ContainerStarted","Data":"fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.706834 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-8kqgz" Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.709445 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8946c48-0a50-449c-b64a-e8e4ae2f84ba","Type":"ContainerStarted","Data":"b6f359634a77f769f8031f3619dfe1d92d8655ecb9ac23aa25ab0dfe2e7931a4"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.712382 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d4a10ac0-e47f-47cf-9779-d60c30b14755","Type":"ContainerStarted","Data":"31ba20b248cec7058dba917dcf36d5e4fb82da3a9e74b7f3c12e428f1868b6d2"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.714269 4929 generic.go:334] "Generic (PLEG): container finished" podID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerID="5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a" exitCode=0 Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.714342 4929 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fv8ff" event={"ID":"0e942503-506b-4a11-aa8b-ca122be42fbb","Type":"ContainerDied","Data":"5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.716416 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469da009-8740-4581-90b5-1e99b80a7f81","Type":"ContainerStarted","Data":"c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.716560 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.719111 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" event={"ID":"f960fb69-8ee1-4711-b685-eb8dd09a393f","Type":"ContainerStarted","Data":"fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559"} Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.719290 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.745802 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ld7dp" podStartSLOduration=2.6180292080000003 podStartE2EDuration="12.745707858s" podCreationTimestamp="2025-10-02 11:28:55 +0000 UTC" firstStartedPulling="2025-10-02 11:28:56.189691838 +0000 UTC m=+1136.740058202" lastFinishedPulling="2025-10-02 11:29:06.317370488 +0000 UTC m=+1146.867736852" observedRunningTime="2025-10-02 11:29:07.722020903 +0000 UTC m=+1148.272387277" watchObservedRunningTime="2025-10-02 11:29:07.745707858 +0000 UTC m=+1148.296074222" Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.755852 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.738596143 podStartE2EDuration="30.755832513s" podCreationTimestamp="2025-10-02 11:28:37 +0000 UTC" firstStartedPulling="2025-10-02 11:28:51.000029338 +0000 UTC m=+1131.550395712" lastFinishedPulling="2025-10-02 11:29:05.017265718 +0000 UTC m=+1145.567632082" observedRunningTime="2025-10-02 11:29:07.74690818 +0000 UTC m=+1148.297274544" watchObservedRunningTime="2025-10-02 11:29:07.755832513 +0000 UTC m=+1148.306198887" Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.807356 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.009759784 podStartE2EDuration="28.807340204s" podCreationTimestamp="2025-10-02 11:28:39 +0000 UTC" firstStartedPulling="2025-10-02 11:28:50.311163377 +0000 UTC m=+1130.861529741" lastFinishedPulling="2025-10-02 11:29:05.108743797 +0000 UTC m=+1145.659110161" observedRunningTime="2025-10-02 11:29:07.794105974 +0000 UTC m=+1148.344472338" watchObservedRunningTime="2025-10-02 11:29:07.807340204 +0000 UTC m=+1148.357706558" Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.844721 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-c4mh9" podStartSLOduration=5.409226043 podStartE2EDuration="12.844698361s" podCreationTimestamp="2025-10-02 11:28:55 +0000 UTC" firstStartedPulling="2025-10-02 11:28:57.787885607 +0000 UTC m=+1138.338251971" lastFinishedPulling="2025-10-02 11:29:05.223357925 +0000 UTC m=+1145.773724289" observedRunningTime="2025-10-02 
Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.859168 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" podStartSLOduration=8.728828426 podStartE2EDuration="12.859145934s" podCreationTimestamp="2025-10-02 11:28:55 +0000 UTC" firstStartedPulling="2025-10-02 11:28:56.355859639 +0000 UTC m=+1136.906226003" lastFinishedPulling="2025-10-02 11:29:00.486177147 +0000 UTC m=+1141.036543511" observedRunningTime="2025-10-02 11:29:07.854171789 +0000 UTC m=+1148.404538153" watchObservedRunningTime="2025-10-02 11:29:07.859145934 +0000 UTC m=+1148.409512298"
Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.890274 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8kqgz" podStartSLOduration=15.358946385 podStartE2EDuration="29.890228549s" podCreationTimestamp="2025-10-02 11:28:38 +0000 UTC" firstStartedPulling="2025-10-02 11:28:50.298068931 +0000 UTC m=+1130.848435315" lastFinishedPulling="2025-10-02 11:29:04.829351115 +0000 UTC m=+1145.379717479" observedRunningTime="2025-10-02 11:29:07.878699676 +0000 UTC m=+1148.429066040" watchObservedRunningTime="2025-10-02 11:29:07.890228549 +0000 UTC m=+1148.440594933"
Oct 02 11:29:07 crc kubenswrapper[4929]: I1002 11:29:07.898298 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.477977342 podStartE2EDuration="34.898282789s" podCreationTimestamp="2025-10-02 11:28:33 +0000 UTC" firstStartedPulling="2025-10-02 11:28:50.000033623 +0000 UTC m=+1130.550399987" lastFinishedPulling="2025-10-02 11:29:06.42033907 +0000 UTC m=+1146.970705434" observedRunningTime="2025-10-02 11:29:07.896347036 +0000 UTC m=+1148.446713390" watchObservedRunningTime="2025-10-02 11:29:07.898282789 +0000 UTC m=+1148.448649143"
Oct 02 11:29:07 crc kubenswrapper[4929]: E1002 11:29:07.961036 4929 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.173:51782->38.102.83.173:39349: read tcp 38.102.83.173:51782->38.102.83.173:39349: read: connection reset by peer
Oct 02 11:29:08 crc kubenswrapper[4929]: I1002 11:29:08.730238 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fv8ff" event={"ID":"0e942503-506b-4a11-aa8b-ca122be42fbb","Type":"ContainerStarted","Data":"23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b"}
Oct 02 11:29:08 crc kubenswrapper[4929]: I1002 11:29:08.731221 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fv8ff" event={"ID":"0e942503-506b-4a11-aa8b-ca122be42fbb","Type":"ContainerStarted","Data":"5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0"}
Oct 02 11:29:08 crc kubenswrapper[4929]: I1002 11:29:08.752123 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fv8ff" podStartSLOduration=16.107189191 podStartE2EDuration="30.752100237s" podCreationTimestamp="2025-10-02 11:28:38 +0000 UTC" firstStartedPulling="2025-10-02 11:28:50.078281371 +0000 UTC m=+1130.628647745" lastFinishedPulling="2025-10-02 11:29:04.723192427 +0000 UTC m=+1145.273558791" observedRunningTime="2025-10-02 11:29:08.746263958 +0000 UTC m=+1149.296630332" watchObservedRunningTime="2025-10-02 11:29:08.752100237 +0000 UTC m=+1149.302466601"
Oct 02 11:29:09 crc kubenswrapper[4929]: I1002 11:29:09.085247 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Oct 02 11:29:09 crc kubenswrapper[4929]: I1002 11:29:09.085289 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Oct 02 11:29:09 crc kubenswrapper[4929]: I1002 11:29:09.122892 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Oct 02 11:29:09 crc kubenswrapper[4929]: I1002 11:29:09.737465 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:29:09 crc kubenswrapper[4929]: I1002 11:29:09.737843 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:29:10 crc kubenswrapper[4929]: I1002 11:29:10.667164 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:29:10 crc kubenswrapper[4929]: I1002 11:29:10.667207 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:29:10 crc kubenswrapper[4929]: I1002 11:29:10.702346 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:29:11 crc kubenswrapper[4929]: I1002 11:29:11.332771 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 02 11:29:11 crc kubenswrapper[4929]: I1002 11:29:11.332856 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 02 11:29:11 crc kubenswrapper[4929]: I1002 11:29:11.392850 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 02 11:29:11 crc kubenswrapper[4929]: I1002 11:29:11.799361 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.168715 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.179322 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hl5hq"]
Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.180540 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hl5hq"
Need to start a new one" pod="openstack/placement-db-create-hl5hq" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.187360 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hl5hq"] Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.266282 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmtlh\" (UniqueName: \"kubernetes.io/projected/6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5-kube-api-access-vmtlh\") pod \"placement-db-create-hl5hq\" (UID: \"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5\") " pod="openstack/placement-db-create-hl5hq" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.320305 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.320351 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.367696 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmtlh\" (UniqueName: \"kubernetes.io/projected/6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5-kube-api-access-vmtlh\") pod \"placement-db-create-hl5hq\" (UID: \"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5\") " pod="openstack/placement-db-create-hl5hq" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.369223 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.399669 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmtlh\" (UniqueName: \"kubernetes.io/projected/6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5-kube-api-access-vmtlh\") pod \"placement-db-create-hl5hq\" (UID: \"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5\") " pod="openstack/placement-db-create-hl5hq" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.409166 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-cmqkl"] Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.410071 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cmqkl" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.414004 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cmqkl"] Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.565556 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hl5hq" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.571102 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25zz\" (UniqueName: \"kubernetes.io/projected/35ae93dc-6dea-4651-b80a-896bf744ed05-kube-api-access-b25zz\") pod \"glance-db-create-cmqkl\" (UID: \"35ae93dc-6dea-4651-b80a-896bf744ed05\") " pod="openstack/glance-db-create-cmqkl" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.672583 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25zz\" (UniqueName: \"kubernetes.io/projected/35ae93dc-6dea-4651-b80a-896bf744ed05-kube-api-access-b25zz\") pod \"glance-db-create-cmqkl\" (UID: \"35ae93dc-6dea-4651-b80a-896bf744ed05\") " pod="openstack/glance-db-create-cmqkl" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.705000 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25zz\" (UniqueName: \"kubernetes.io/projected/35ae93dc-6dea-4651-b80a-896bf744ed05-kube-api-access-b25zz\") pod \"glance-db-create-cmqkl\" (UID: \"35ae93dc-6dea-4651-b80a-896bf744ed05\") " pod="openstack/glance-db-create-cmqkl" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.741693 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cmqkl" Oct 02 11:29:12 crc kubenswrapper[4929]: I1002 11:29:12.818310 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.035540 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hl5hq"] Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.080146 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cmqkl"] Oct 02 11:29:13 crc kubenswrapper[4929]: W1002 11:29:13.096824 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c9b80a9_e1cb_4492_ad9b_ff8b771fcce5.slice/crio-27ce62ab3fa23069ffc75f21b549e5ae55a9b956df5f7286efaa2529b4ffb15b WatchSource:0}: Error finding container 27ce62ab3fa23069ffc75f21b549e5ae55a9b956df5f7286efaa2529b4ffb15b: Status 404 returned error can't find the container with id 27ce62ab3fa23069ffc75f21b549e5ae55a9b956df5f7286efaa2529b4ffb15b Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.768831 4929 generic.go:334] "Generic (PLEG): container finished" podID="6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5" containerID="7384489077dc8cd3293840bae9aadd9e62441ed7dbde530c3d8c89e74b1b0110" exitCode=0 Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.768914 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hl5hq" event={"ID":"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5","Type":"ContainerDied","Data":"7384489077dc8cd3293840bae9aadd9e62441ed7dbde530c3d8c89e74b1b0110"} Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.768942 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hl5hq" event={"ID":"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5","Type":"ContainerStarted","Data":"27ce62ab3fa23069ffc75f21b549e5ae55a9b956df5f7286efaa2529b4ffb15b"} Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.772721 4929 generic.go:334] "Generic (PLEG): container finished" podID="35ae93dc-6dea-4651-b80a-896bf744ed05" 
containerID="34777fd28d6f0a388c6451ee50aae23f26532ca2ddbc18c61ad44db7497a0ee7" exitCode=0 Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.772803 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cmqkl" event={"ID":"35ae93dc-6dea-4651-b80a-896bf744ed05","Type":"ContainerDied","Data":"34777fd28d6f0a388c6451ee50aae23f26532ca2ddbc18c61ad44db7497a0ee7"} Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.773103 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cmqkl" event={"ID":"35ae93dc-6dea-4651-b80a-896bf744ed05","Type":"ContainerStarted","Data":"f2e6324766ae5e52c8aa68de995f3025b39a225cd2fe41b7a714d0b55950b148"} Oct 02 11:29:13 crc kubenswrapper[4929]: I1002 11:29:13.933444 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.019012 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-qmk6j"] Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.019292 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" podUID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerName="dnsmasq-dns" containerID="cri-o://fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559" gracePeriod=10 Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.020073 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.055355 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v89pl"] Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.057015 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.082537 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v89pl"] Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.182925 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.216216 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.216329 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.216385 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psplz\" (UniqueName: \"kubernetes.io/projected/37b19762-0589-4167-8a91-b0ccd22fefdf-kube-api-access-psplz\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.216417 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.216693 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-config\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.318515 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-config\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.318587 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.318659 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.318743 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl"
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.321192 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-config\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl"
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.321323 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl"
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.321567 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl"
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.321724 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl"
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.359099 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psplz\" (UniqueName: \"kubernetes.io/projected/37b19762-0589-4167-8a91-b0ccd22fefdf-kube-api-access-psplz\") pod \"dnsmasq-dns-b8fbc5445-v89pl\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " pod="openstack/dnsmasq-dns-b8fbc5445-v89pl"
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.388511 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl"
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.520572 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j"
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.630381 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-ovsdbserver-nb\") pod \"f960fb69-8ee1-4711-b685-eb8dd09a393f\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.630549 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlfbr\" (UniqueName: \"kubernetes.io/projected/f960fb69-8ee1-4711-b685-eb8dd09a393f-kube-api-access-jlfbr\") pod \"f960fb69-8ee1-4711-b685-eb8dd09a393f\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.630641 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-config\") pod \"f960fb69-8ee1-4711-b685-eb8dd09a393f\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.630725 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-dns-svc\") pod \"f960fb69-8ee1-4711-b685-eb8dd09a393f\" (UID: \"f960fb69-8ee1-4711-b685-eb8dd09a393f\") " Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.636700 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f960fb69-8ee1-4711-b685-eb8dd09a393f-kube-api-access-jlfbr" (OuterVolumeSpecName: "kube-api-access-jlfbr") pod "f960fb69-8ee1-4711-b685-eb8dd09a393f" (UID: "f960fb69-8ee1-4711-b685-eb8dd09a393f"). InnerVolumeSpecName "kube-api-access-jlfbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.681240 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f960fb69-8ee1-4711-b685-eb8dd09a393f" (UID: "f960fb69-8ee1-4711-b685-eb8dd09a393f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.696719 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f960fb69-8ee1-4711-b685-eb8dd09a393f" (UID: "f960fb69-8ee1-4711-b685-eb8dd09a393f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.709586 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-config" (OuterVolumeSpecName: "config") pod "f960fb69-8ee1-4711-b685-eb8dd09a393f" (UID: "f960fb69-8ee1-4711-b685-eb8dd09a393f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.733107 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.733145 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlfbr\" (UniqueName: \"kubernetes.io/projected/f960fb69-8ee1-4711-b685-eb8dd09a393f-kube-api-access-jlfbr\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.733157 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.733168 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f960fb69-8ee1-4711-b685-eb8dd09a393f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.790495 4929 generic.go:334] "Generic (PLEG): container finished" podID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerID="fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559" exitCode=0 Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.790573 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.790625 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" event={"ID":"f960fb69-8ee1-4711-b685-eb8dd09a393f","Type":"ContainerDied","Data":"fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559"} Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.790669 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-qmk6j" event={"ID":"f960fb69-8ee1-4711-b685-eb8dd09a393f","Type":"ContainerDied","Data":"bf3f0b6b29e3047e7c1b81a1462e9b081062cc5a7f8c6e23016a51d441c12f1b"} Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.790689 4929 scope.go:117] "RemoveContainer" containerID="fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.833715 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-qmk6j"] Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.837078 4929 scope.go:117] "RemoveContainer" containerID="b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.840743 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-qmk6j"] Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.854621 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v89pl"] Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.866107 4929 scope.go:117] "RemoveContainer" containerID="fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559" Oct 02 11:29:14 crc kubenswrapper[4929]: E1002 11:29:14.867135 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559\": container with ID starting with 
Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.867178 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559"} err="failed to get container status \"fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559\": rpc error: code = NotFound desc = could not find container \"fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559\": container with ID starting with fa63b504e946621b972c7ee8033a6a15f7919fd9ba6758ff836cd5b492934559 not found: ID does not exist" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.867204 4929 scope.go:117] "RemoveContainer" containerID="b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c" Oct 02 11:29:14 crc kubenswrapper[4929]: E1002 11:29:14.867602 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c\": container with ID starting with b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c not found: ID does not exist" containerID="b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c" Oct 02 11:29:14 crc kubenswrapper[4929]: I1002 11:29:14.867632 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c"} err="failed to get container status \"b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c\": rpc error: code = NotFound desc = could not find container \"b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c\": container with ID starting with b65b4b0b73de2235b0514cc3950fbb39d3e96c5423187a4e2c27b345da85597c not found: ID does not exist" Oct 02 11:29:14 crc kubenswrapper[4929]: W1002 11:29:14.872171 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b19762_0589_4167_8a91_b0ccd22fefdf.slice/crio-1263ae6459bc03d5e1eb29a78c1a8da170c4bea203f23764ef6436d116e7ec86 WatchSource:0}: Error finding container 1263ae6459bc03d5e1eb29a78c1a8da170c4bea203f23764ef6436d116e7ec86: Status 404 returned error can't find the container with id 1263ae6459bc03d5e1eb29a78c1a8da170c4bea203f23764ef6436d116e7ec86 Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.027009 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf960fb69_8ee1_4711_b685_eb8dd09a393f.slice/crio-bf3f0b6b29e3047e7c1b81a1462e9b081062cc5a7f8c6e23016a51d441c12f1b\": RecentStats: unable to find data in memory cache]" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.166834 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hl5hq" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.171952 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.172106 4929 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-create-cmqkl" Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.173191 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerName="dnsmasq-dns" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.173217 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerName="dnsmasq-dns" Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.173245 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5" containerName="mariadb-database-create" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.173254 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5" containerName="mariadb-database-create" Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.173276 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ae93dc-6dea-4651-b80a-896bf744ed05" containerName="mariadb-database-create" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.173284 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ae93dc-6dea-4651-b80a-896bf744ed05" containerName="mariadb-database-create" Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.173299 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerName="init" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.173307 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerName="init" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.173535 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5" containerName="mariadb-database-create" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.173568 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ae93dc-6dea-4651-b80a-896bf744ed05" containerName="mariadb-database-create" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.173584 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f960fb69-8ee1-4711-b685-eb8dd09a393f" containerName="dnsmasq-dns" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.210186 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.210314 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.212762 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.213186 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-klbt7" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.213543 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.218274 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.241533 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmtlh\" (UniqueName: \"kubernetes.io/projected/6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5-kube-api-access-vmtlh\") pod \"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5\" (UID: \"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5\") " Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.246825 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5-kube-api-access-vmtlh" (OuterVolumeSpecName: "kube-api-access-vmtlh") pod "6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5" (UID: "6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5"). InnerVolumeSpecName "kube-api-access-vmtlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.343275 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b25zz\" (UniqueName: \"kubernetes.io/projected/35ae93dc-6dea-4651-b80a-896bf744ed05-kube-api-access-b25zz\") pod \"35ae93dc-6dea-4651-b80a-896bf744ed05\" (UID: \"35ae93dc-6dea-4651-b80a-896bf744ed05\") " Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.343801 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.344101 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-cache\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.344237 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.344440 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh64w\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-kube-api-access-nh64w\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.344568 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-lock\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.344746 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmtlh\" (UniqueName: \"kubernetes.io/projected/6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5-kube-api-access-vmtlh\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.348149 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ae93dc-6dea-4651-b80a-896bf744ed05-kube-api-access-b25zz" (OuterVolumeSpecName: "kube-api-access-b25zz") pod "35ae93dc-6dea-4651-b80a-896bf744ed05" (UID: "35ae93dc-6dea-4651-b80a-896bf744ed05"). InnerVolumeSpecName "kube-api-access-b25zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.446480 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.446574 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-cache\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.446605 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.446633 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh64w\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-kube-api-access-nh64w\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.446665 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-lock\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.446706 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b25zz\" (UniqueName: \"kubernetes.io/projected/35ae93dc-6dea-4651-b80a-896bf744ed05-kube-api-access-b25zz\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.446797 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.446907 4929 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not 
found Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.446937 4929 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.447000 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift podName:4fca7cc0-4347-4fb0-99a2-5bdef9efd204 nodeName:}" failed. No retries permitted until 2025-10-02 11:29:15.946982995 +0000 UTC m=+1156.497349359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift") pod "swift-storage-0" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204") : configmap "swift-ring-files" not found Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.447088 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-lock\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.447715 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-cache\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.466230 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.474893 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh64w\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-kube-api-access-nh64w\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.704550 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.798343 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hl5hq" event={"ID":"6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5","Type":"ContainerDied","Data":"27ce62ab3fa23069ffc75f21b549e5ae55a9b956df5f7286efaa2529b4ffb15b"} Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.798393 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27ce62ab3fa23069ffc75f21b549e5ae55a9b956df5f7286efaa2529b4ffb15b" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.798400 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hl5hq" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.800030 4929 generic.go:334] "Generic (PLEG): container finished" podID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerID="dedfaab2c2b9cc2cd612bc15f496e7f441b45b7f698ec0232ceba814f30d469e" exitCode=0 Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.800091 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" event={"ID":"37b19762-0589-4167-8a91-b0ccd22fefdf","Type":"ContainerDied","Data":"dedfaab2c2b9cc2cd612bc15f496e7f441b45b7f698ec0232ceba814f30d469e"} Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.800118 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" event={"ID":"37b19762-0589-4167-8a91-b0ccd22fefdf","Type":"ContainerStarted","Data":"1263ae6459bc03d5e1eb29a78c1a8da170c4bea203f23764ef6436d116e7ec86"} Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.803225 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cmqkl" event={"ID":"35ae93dc-6dea-4651-b80a-896bf744ed05","Type":"ContainerDied","Data":"f2e6324766ae5e52c8aa68de995f3025b39a225cd2fe41b7a714d0b55950b148"} Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.803260 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cmqkl" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.803268 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e6324766ae5e52c8aa68de995f3025b39a225cd2fe41b7a714d0b55950b148" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.947814 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.950724 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.954062 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.954098 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.954131 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.954167 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.954271 4929 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.954285 4929 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:29:15 crc kubenswrapper[4929]: E1002 11:29:15.954323 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift podName:4fca7cc0-4347-4fb0-99a2-5bdef9efd204 nodeName:}" failed. 
No retries permitted until 2025-10-02 11:29:16.954308687 +0000 UTC m=+1157.504675041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift") pod "swift-storage-0" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204") : configmap "swift-ring-files" not found Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.954449 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gpbmt" Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.957198 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:29:15 crc kubenswrapper[4929]: I1002 11:29:15.994016 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.055693 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.055738 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/61e50682-8502-4570-916a-a3b90a5218e4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.055796 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-scripts\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.055811 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.055832 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-config\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.055907 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.055952 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psfwp\" (UniqueName: \"kubernetes.io/projected/61e50682-8502-4570-916a-a3b90a5218e4-kube-api-access-psfwp\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: 
I1002 11:29:16.156864 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psfwp\" (UniqueName: \"kubernetes.io/projected/61e50682-8502-4570-916a-a3b90a5218e4-kube-api-access-psfwp\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.156910 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.156936 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/61e50682-8502-4570-916a-a3b90a5218e4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.157079 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-scripts\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.157104 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.157128 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-config\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.157227 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.158359 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-scripts\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.161029 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.161397 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc 
kubenswrapper[4929]: I1002 11:29:16.162041 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-config\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.162267 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.162525 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/61e50682-8502-4570-916a-a3b90a5218e4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.176461 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psfwp\" (UniqueName: \"kubernetes.io/projected/61e50682-8502-4570-916a-a3b90a5218e4-kube-api-access-psfwp\") pod \"ovn-northd-0\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.179633 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f960fb69-8ee1-4711-b685-eb8dd09a393f" path="/var/lib/kubelet/pods/f960fb69-8ee1-4711-b685-eb8dd09a393f/volumes" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.272331 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.674601 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.819044 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"61e50682-8502-4570-916a-a3b90a5218e4","Type":"ContainerStarted","Data":"2c43719f35986168c0d7d320db2819d69012116b5e71c9255527e81b3f4584a1"} Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.820941 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" event={"ID":"37b19762-0589-4167-8a91-b0ccd22fefdf","Type":"ContainerStarted","Data":"f1c16475ebb3a9d3bb20a389c8c2b5741fb8a4caea85c6a0a5e5da18a1e2a31d"} Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.821112 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.840662 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" podStartSLOduration=2.840644261 podStartE2EDuration="2.840644261s" podCreationTimestamp="2025-10-02 11:29:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:29:16.837168387 +0000 UTC m=+1157.387534751" watchObservedRunningTime="2025-10-02 11:29:16.840644261 +0000 UTC m=+1157.391010625" Oct 02 11:29:16 crc kubenswrapper[4929]: I1002 11:29:16.969724 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod 
\"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:16 crc kubenswrapper[4929]: E1002 11:29:16.969945 4929 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:29:16 crc kubenswrapper[4929]: E1002 11:29:16.970002 4929 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:29:16 crc kubenswrapper[4929]: E1002 11:29:16.970079 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift podName:4fca7cc0-4347-4fb0-99a2-5bdef9efd204 nodeName:}" failed. No retries permitted until 2025-10-02 11:29:18.970051952 +0000 UTC m=+1159.520418326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift") pod "swift-storage-0" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204") : configmap "swift-ring-files" not found Oct 02 11:29:18 crc kubenswrapper[4929]: I1002 11:29:18.841259 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"61e50682-8502-4570-916a-a3b90a5218e4","Type":"ContainerStarted","Data":"c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a"} Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.007428 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:19 crc kubenswrapper[4929]: E1002 11:29:19.007799 4929 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:29:19 crc kubenswrapper[4929]: E1002 11:29:19.008094 4929 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:29:19 crc kubenswrapper[4929]: E1002 11:29:19.008331 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift podName:4fca7cc0-4347-4fb0-99a2-5bdef9efd204 nodeName:}" failed. No retries permitted until 2025-10-02 11:29:23.008310394 +0000 UTC m=+1163.558676768 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift") pod "swift-storage-0" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204") : configmap "swift-ring-files" not found Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.089644 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9tlgd"] Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.090978 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.094790 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.095001 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.095729 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.107077 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9tlgd"] Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.211310 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-combined-ca-bundle\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.211496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-ring-data-devices\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.211608 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-swiftconf\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.211709 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-scripts\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.211784 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-494wn\" (UniqueName: \"kubernetes.io/projected/29989848-bd4f-4d93-a71f-95965ef153e8-kube-api-access-494wn\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.211928 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29989848-bd4f-4d93-a71f-95965ef153e8-etc-swift\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.212081 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-dispersionconf\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 
11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.314316 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-494wn\" (UniqueName: \"kubernetes.io/projected/29989848-bd4f-4d93-a71f-95965ef153e8-kube-api-access-494wn\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.314538 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29989848-bd4f-4d93-a71f-95965ef153e8-etc-swift\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.314610 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-dispersionconf\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.314725 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-combined-ca-bundle\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.314851 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-ring-data-devices\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.314907 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-swiftconf\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.314997 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-scripts\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.315740 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29989848-bd4f-4d93-a71f-95965ef153e8-etc-swift\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.316137 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-ring-data-devices\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.316406 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-scripts\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.319229 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-combined-ca-bundle\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.319810 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-swiftconf\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.330581 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-dispersionconf\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.339685 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-494wn\" (UniqueName: \"kubernetes.io/projected/29989848-bd4f-4d93-a71f-95965ef153e8-kube-api-access-494wn\") pod \"swift-ring-rebalance-9tlgd\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.412537 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.814138 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9tlgd"] Oct 02 11:29:19 crc kubenswrapper[4929]: W1002 11:29:19.819721 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29989848_bd4f_4d93_a71f_95965ef153e8.slice/crio-98278fb2c70ffbdb56e906808a4dee563bcb88c5f1e3adb034830762bec07c9d WatchSource:0}: Error finding container 98278fb2c70ffbdb56e906808a4dee563bcb88c5f1e3adb034830762bec07c9d: Status 404 returned error can't find the container with id 98278fb2c70ffbdb56e906808a4dee563bcb88c5f1e3adb034830762bec07c9d Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.849495 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"61e50682-8502-4570-916a-a3b90a5218e4","Type":"ContainerStarted","Data":"a800d27ad9ba4905470d759a654c04cea37a9ca62559cf4a2feee8d6683bdd38"} Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.849645 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.850735 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9tlgd" event={"ID":"29989848-bd4f-4d93-a71f-95965ef153e8","Type":"ContainerStarted","Data":"98278fb2c70ffbdb56e906808a4dee563bcb88c5f1e3adb034830762bec07c9d"} Oct 02 11:29:19 crc kubenswrapper[4929]: I1002 11:29:19.879070 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.665382144 podStartE2EDuration="4.879047303s" podCreationTimestamp="2025-10-02 11:29:15 +0000 UTC" firstStartedPulling="2025-10-02 11:29:16.680043562 +0000 UTC m=+1157.230409926" lastFinishedPulling="2025-10-02 11:29:17.893708721 +0000 UTC m=+1158.444075085" observedRunningTime="2025-10-02 11:29:19.866589474 +0000 UTC m=+1160.416955848" watchObservedRunningTime="2025-10-02 11:29:19.879047303 +0000 UTC m=+1160.429413667" Oct 02 11:29:21 crc kubenswrapper[4929]: I1002 11:29:21.774807 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-5frdp"] Oct 02 11:29:21 crc kubenswrapper[4929]: I1002 11:29:21.776324 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-5frdp" Oct 02 11:29:21 crc kubenswrapper[4929]: I1002 11:29:21.785018 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5frdp"] Oct 02 11:29:21 crc kubenswrapper[4929]: I1002 11:29:21.857266 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56pl\" (UniqueName: \"kubernetes.io/projected/55506978-e694-48ad-be63-ad2179c36f0f-kube-api-access-m56pl\") pod \"keystone-db-create-5frdp\" (UID: \"55506978-e694-48ad-be63-ad2179c36f0f\") " pod="openstack/keystone-db-create-5frdp" Oct 02 11:29:21 crc kubenswrapper[4929]: I1002 11:29:21.959524 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56pl\" (UniqueName: \"kubernetes.io/projected/55506978-e694-48ad-be63-ad2179c36f0f-kube-api-access-m56pl\") pod \"keystone-db-create-5frdp\" (UID: \"55506978-e694-48ad-be63-ad2179c36f0f\") " pod="openstack/keystone-db-create-5frdp" Oct 02 11:29:21 crc kubenswrapper[4929]: I1002 11:29:21.982453 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56pl\" (UniqueName: \"kubernetes.io/projected/55506978-e694-48ad-be63-ad2179c36f0f-kube-api-access-m56pl\") pod \"keystone-db-create-5frdp\" (UID: \"55506978-e694-48ad-be63-ad2179c36f0f\") " pod="openstack/keystone-db-create-5frdp" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.115777 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5frdp" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.194071 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5c11-account-create-8bfzg"] Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.195402 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5c11-account-create-8bfzg" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.197373 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.200447 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c11-account-create-8bfzg"] Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.266844 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7llq5\" (UniqueName: \"kubernetes.io/projected/89926e8c-7fcf-4877-8be1-8bce650c2ca1-kube-api-access-7llq5\") pod \"placement-5c11-account-create-8bfzg\" (UID: \"89926e8c-7fcf-4877-8be1-8bce650c2ca1\") " pod="openstack/placement-5c11-account-create-8bfzg" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.368515 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7llq5\" (UniqueName: \"kubernetes.io/projected/89926e8c-7fcf-4877-8be1-8bce650c2ca1-kube-api-access-7llq5\") pod \"placement-5c11-account-create-8bfzg\" (UID: \"89926e8c-7fcf-4877-8be1-8bce650c2ca1\") " pod="openstack/placement-5c11-account-create-8bfzg" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.385448 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7llq5\" (UniqueName: \"kubernetes.io/projected/89926e8c-7fcf-4877-8be1-8bce650c2ca1-kube-api-access-7llq5\") pod \"placement-5c11-account-create-8bfzg\" (UID: \"89926e8c-7fcf-4877-8be1-8bce650c2ca1\") " pod="openstack/placement-5c11-account-create-8bfzg" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.514446 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c11-account-create-8bfzg" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.531081 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a7fb-account-create-kq6p5"] Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.532293 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a7fb-account-create-kq6p5" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.534843 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.550452 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a7fb-account-create-kq6p5"] Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.675077 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj4xp\" (UniqueName: \"kubernetes.io/projected/4b4490c9-7acf-4c3e-96d9-e88b6d777b8a-kube-api-access-cj4xp\") pod \"glance-a7fb-account-create-kq6p5\" (UID: \"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a\") " pod="openstack/glance-a7fb-account-create-kq6p5" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.777339 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj4xp\" (UniqueName: \"kubernetes.io/projected/4b4490c9-7acf-4c3e-96d9-e88b6d777b8a-kube-api-access-cj4xp\") pod \"glance-a7fb-account-create-kq6p5\" (UID: \"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a\") " pod="openstack/glance-a7fb-account-create-kq6p5" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.794994 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj4xp\" (UniqueName: \"kubernetes.io/projected/4b4490c9-7acf-4c3e-96d9-e88b6d777b8a-kube-api-access-cj4xp\") pod \"glance-a7fb-account-create-kq6p5\" (UID: \"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a\") " pod="openstack/glance-a7fb-account-create-kq6p5" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.848764 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a7fb-account-create-kq6p5" Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.874279 4929 generic.go:334] "Generic (PLEG): container finished" podID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerID="dcb01c0ec91fa8b636cd159dd6d4fbe9815deb68d2051731a33d12b7eda329bb" exitCode=0 Oct 02 11:29:22 crc kubenswrapper[4929]: I1002 11:29:22.874333 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfb673e7-59bc-41d1-9bf0-d20527c4a740","Type":"ContainerDied","Data":"dcb01c0ec91fa8b636cd159dd6d4fbe9815deb68d2051731a33d12b7eda329bb"} Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.085427 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:23 crc kubenswrapper[4929]: E1002 11:29:23.085642 4929 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:29:23 crc kubenswrapper[4929]: E1002 11:29:23.085673 4929 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:29:23 crc kubenswrapper[4929]: E1002 11:29:23.085746 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift podName:4fca7cc0-4347-4fb0-99a2-5bdef9efd204 nodeName:}" failed. No retries permitted until 2025-10-02 11:29:31.085722773 +0000 UTC m=+1171.636089147 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift") pod "swift-storage-0" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204") : configmap "swift-ring-files" not found Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.745153 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c11-account-create-8bfzg"] Oct 02 11:29:23 crc kubenswrapper[4929]: W1002 11:29:23.749382 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89926e8c_7fcf_4877_8be1_8bce650c2ca1.slice/crio-585473797acb85cff242095f8bb41c87817d0d8769d4b07f7ca9828aa9148fea WatchSource:0}: Error finding container 585473797acb85cff242095f8bb41c87817d0d8769d4b07f7ca9828aa9148fea: Status 404 returned error can't find the container with id 585473797acb85cff242095f8bb41c87817d0d8769d4b07f7ca9828aa9148fea Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.829735 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5frdp"] Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.835390 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a7fb-account-create-kq6p5"] Oct 02 11:29:23 crc kubenswrapper[4929]: W1002 11:29:23.836333 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55506978_e694_48ad_be63_ad2179c36f0f.slice/crio-71379f81211b0fb7b23137e876edd83d57139caac86f6aa8227f21bf8b1aafa4 WatchSource:0}: Error finding container 71379f81211b0fb7b23137e876edd83d57139caac86f6aa8227f21bf8b1aafa4: Status 404 returned error can't find the container with id 71379f81211b0fb7b23137e876edd83d57139caac86f6aa8227f21bf8b1aafa4 Oct 02 11:29:23 crc kubenswrapper[4929]: W1002 11:29:23.840806 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b4490c9_7acf_4c3e_96d9_e88b6d777b8a.slice/crio-1bdb05a2335d11493c4daf6bd1e047dc9608efc7758dd3d76d95e62289304884 WatchSource:0}: Error finding container 1bdb05a2335d11493c4daf6bd1e047dc9608efc7758dd3d76d95e62289304884: Status 404 returned error can't find the container with id 1bdb05a2335d11493c4daf6bd1e047dc9608efc7758dd3d76d95e62289304884 Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.885356 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9tlgd" event={"ID":"29989848-bd4f-4d93-a71f-95965ef153e8","Type":"ContainerStarted","Data":"96f765af27e40428cb700ca16b6c24d9da1ada736c4753ceb2817710484416f8"} Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.886498 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5frdp" event={"ID":"55506978-e694-48ad-be63-ad2179c36f0f","Type":"ContainerStarted","Data":"71379f81211b0fb7b23137e876edd83d57139caac86f6aa8227f21bf8b1aafa4"} Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.887437 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c11-account-create-8bfzg" event={"ID":"89926e8c-7fcf-4877-8be1-8bce650c2ca1","Type":"ContainerStarted","Data":"585473797acb85cff242095f8bb41c87817d0d8769d4b07f7ca9828aa9148fea"} Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.888569 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a7fb-account-create-kq6p5" 
event={"ID":"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a","Type":"ContainerStarted","Data":"1bdb05a2335d11493c4daf6bd1e047dc9608efc7758dd3d76d95e62289304884"} Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.890602 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfb673e7-59bc-41d1-9bf0-d20527c4a740","Type":"ContainerStarted","Data":"1300e80581b8037301a49cd07f0c5f8de41330fcc719f6803e48273136aa7404"} Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.891289 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.912134 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-9tlgd" podStartSLOduration=1.37774073 podStartE2EDuration="4.912116015s" podCreationTimestamp="2025-10-02 11:29:19 +0000 UTC" firstStartedPulling="2025-10-02 11:29:19.822236407 +0000 UTC m=+1160.372602771" lastFinishedPulling="2025-10-02 11:29:23.356611682 +0000 UTC m=+1163.906978056" observedRunningTime="2025-10-02 11:29:23.907266553 +0000 UTC m=+1164.457632917" watchObservedRunningTime="2025-10-02 11:29:23.912116015 +0000 UTC m=+1164.462482379" Oct 02 11:29:23 crc kubenswrapper[4929]: I1002 11:29:23.941458 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.408687751 podStartE2EDuration="56.941434983s" podCreationTimestamp="2025-10-02 11:28:27 +0000 UTC" firstStartedPulling="2025-10-02 11:28:29.73248207 +0000 UTC m=+1110.282848434" lastFinishedPulling="2025-10-02 11:28:49.265229312 +0000 UTC m=+1129.815595666" observedRunningTime="2025-10-02 11:29:23.934554356 +0000 UTC m=+1164.484920730" watchObservedRunningTime="2025-10-02 11:29:23.941434983 +0000 UTC m=+1164.491801367" Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.390735 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.461140 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-c4mh9"] Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.461707 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-c4mh9" podUID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerName="dnsmasq-dns" containerID="cri-o://4d4355c4dc4d2e4f00a14252d9737d8f37882243df924d832eac53e1c23cdf47" gracePeriod=10 Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.899798 4929 generic.go:334] "Generic (PLEG): container finished" podID="55506978-e694-48ad-be63-ad2179c36f0f" containerID="5d9df53a4f7dd6bf6680f6dc938e2a767b61229af9d876f885f266a730cf0a8a" exitCode=0 Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.899867 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5frdp" event={"ID":"55506978-e694-48ad-be63-ad2179c36f0f","Type":"ContainerDied","Data":"5d9df53a4f7dd6bf6680f6dc938e2a767b61229af9d876f885f266a730cf0a8a"} Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.904227 4929 generic.go:334] "Generic (PLEG): container finished" podID="89926e8c-7fcf-4877-8be1-8bce650c2ca1" containerID="047acfe6971368179c8ba9cdbd0536c0529e14bdc217f4de7400dd6418593eae" exitCode=0 Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.904306 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-5c11-account-create-8bfzg" event={"ID":"89926e8c-7fcf-4877-8be1-8bce650c2ca1","Type":"ContainerDied","Data":"047acfe6971368179c8ba9cdbd0536c0529e14bdc217f4de7400dd6418593eae"} Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.906722 4929 generic.go:334] "Generic (PLEG): container finished" podID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerID="4d4355c4dc4d2e4f00a14252d9737d8f37882243df924d832eac53e1c23cdf47" exitCode=0 Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.906780 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-c4mh9" event={"ID":"d8e81b61-3c27-40f2-8418-eb7fa38d00a5","Type":"ContainerDied","Data":"4d4355c4dc4d2e4f00a14252d9737d8f37882243df924d832eac53e1c23cdf47"} Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.915571 4929 generic.go:334] "Generic (PLEG): container finished" podID="4b4490c9-7acf-4c3e-96d9-e88b6d777b8a" containerID="4443b5280736a17dd9f32e8de85fda28418ccddac29c408f2aa471462e13392a" exitCode=0 Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.917757 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a7fb-account-create-kq6p5" event={"ID":"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a","Type":"ContainerDied","Data":"4443b5280736a17dd9f32e8de85fda28418ccddac29c408f2aa471462e13392a"} Oct 02 11:29:24 crc kubenswrapper[4929]: I1002 11:29:24.995392 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.122984 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-nb\") pod \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.123113 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-dns-svc\") pod \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.123220 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-sb\") pod \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.123333 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-config\") pod \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.123409 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn48w\" (UniqueName: \"kubernetes.io/projected/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-kube-api-access-fn48w\") pod \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\" (UID: \"d8e81b61-3c27-40f2-8418-eb7fa38d00a5\") " Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.159686 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-kube-api-access-fn48w" (OuterVolumeSpecName: 
"kube-api-access-fn48w") pod "d8e81b61-3c27-40f2-8418-eb7fa38d00a5" (UID: "d8e81b61-3c27-40f2-8418-eb7fa38d00a5"). InnerVolumeSpecName "kube-api-access-fn48w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.176820 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-config" (OuterVolumeSpecName: "config") pod "d8e81b61-3c27-40f2-8418-eb7fa38d00a5" (UID: "d8e81b61-3c27-40f2-8418-eb7fa38d00a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.182411 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8e81b61-3c27-40f2-8418-eb7fa38d00a5" (UID: "d8e81b61-3c27-40f2-8418-eb7fa38d00a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.192310 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8e81b61-3c27-40f2-8418-eb7fa38d00a5" (UID: "d8e81b61-3c27-40f2-8418-eb7fa38d00a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.203305 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8e81b61-3c27-40f2-8418-eb7fa38d00a5" (UID: "d8e81b61-3c27-40f2-8418-eb7fa38d00a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.225640 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.225851 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.225988 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.226098 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.226195 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn48w\" (UniqueName: \"kubernetes.io/projected/d8e81b61-3c27-40f2-8418-eb7fa38d00a5-kube-api-access-fn48w\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.923988 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-c4mh9" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.923987 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-c4mh9" event={"ID":"d8e81b61-3c27-40f2-8418-eb7fa38d00a5","Type":"ContainerDied","Data":"71780584a2d70664e45a600bf3c893ed2b312e59bfbbab9be360c87d613a6ed4"} Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.926108 4929 scope.go:117] "RemoveContainer" containerID="4d4355c4dc4d2e4f00a14252d9737d8f37882243df924d832eac53e1c23cdf47" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.962937 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-c4mh9"] Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.966129 4929 scope.go:117] "RemoveContainer" containerID="a2b3c63cd957d0d6465546090e7df57ed9c4c41564f42883287a45d18e14c8e9" Oct 02 11:29:25 crc kubenswrapper[4929]: I1002 11:29:25.974515 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-c4mh9"] Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.171562 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" path="/var/lib/kubelet/pods/d8e81b61-3c27-40f2-8418-eb7fa38d00a5/volumes" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.332529 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c11-account-create-8bfzg" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.410629 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5frdp" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.423579 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a7fb-account-create-kq6p5" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.447145 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7llq5\" (UniqueName: \"kubernetes.io/projected/89926e8c-7fcf-4877-8be1-8bce650c2ca1-kube-api-access-7llq5\") pod \"89926e8c-7fcf-4877-8be1-8bce650c2ca1\" (UID: \"89926e8c-7fcf-4877-8be1-8bce650c2ca1\") " Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.452387 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89926e8c-7fcf-4877-8be1-8bce650c2ca1-kube-api-access-7llq5" (OuterVolumeSpecName: "kube-api-access-7llq5") pod "89926e8c-7fcf-4877-8be1-8bce650c2ca1" (UID: "89926e8c-7fcf-4877-8be1-8bce650c2ca1"). InnerVolumeSpecName "kube-api-access-7llq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.549138 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj4xp\" (UniqueName: \"kubernetes.io/projected/4b4490c9-7acf-4c3e-96d9-e88b6d777b8a-kube-api-access-cj4xp\") pod \"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a\" (UID: \"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a\") " Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.549261 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56pl\" (UniqueName: \"kubernetes.io/projected/55506978-e694-48ad-be63-ad2179c36f0f-kube-api-access-m56pl\") pod \"55506978-e694-48ad-be63-ad2179c36f0f\" (UID: \"55506978-e694-48ad-be63-ad2179c36f0f\") " Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.549654 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7llq5\" (UniqueName: \"kubernetes.io/projected/89926e8c-7fcf-4877-8be1-8bce650c2ca1-kube-api-access-7llq5\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.552239 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4490c9-7acf-4c3e-96d9-e88b6d777b8a-kube-api-access-cj4xp" (OuterVolumeSpecName: "kube-api-access-cj4xp") pod "4b4490c9-7acf-4c3e-96d9-e88b6d777b8a" (UID: "4b4490c9-7acf-4c3e-96d9-e88b6d777b8a"). InnerVolumeSpecName "kube-api-access-cj4xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.552632 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55506978-e694-48ad-be63-ad2179c36f0f-kube-api-access-m56pl" (OuterVolumeSpecName: "kube-api-access-m56pl") pod "55506978-e694-48ad-be63-ad2179c36f0f" (UID: "55506978-e694-48ad-be63-ad2179c36f0f"). InnerVolumeSpecName "kube-api-access-m56pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.652223 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj4xp\" (UniqueName: \"kubernetes.io/projected/4b4490c9-7acf-4c3e-96d9-e88b6d777b8a-kube-api-access-cj4xp\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.652256 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m56pl\" (UniqueName: \"kubernetes.io/projected/55506978-e694-48ad-be63-ad2179c36f0f-kube-api-access-m56pl\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.933159 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a7fb-account-create-kq6p5" event={"ID":"4b4490c9-7acf-4c3e-96d9-e88b6d777b8a","Type":"ContainerDied","Data":"1bdb05a2335d11493c4daf6bd1e047dc9608efc7758dd3d76d95e62289304884"} Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.933186 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a7fb-account-create-kq6p5" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.933209 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bdb05a2335d11493c4daf6bd1e047dc9608efc7758dd3d76d95e62289304884" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.934999 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5frdp" event={"ID":"55506978-e694-48ad-be63-ad2179c36f0f","Type":"ContainerDied","Data":"71379f81211b0fb7b23137e876edd83d57139caac86f6aa8227f21bf8b1aafa4"} Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.935018 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71379f81211b0fb7b23137e876edd83d57139caac86f6aa8227f21bf8b1aafa4" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.935030 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5frdp" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.936681 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c11-account-create-8bfzg" event={"ID":"89926e8c-7fcf-4877-8be1-8bce650c2ca1","Type":"ContainerDied","Data":"585473797acb85cff242095f8bb41c87817d0d8769d4b07f7ca9828aa9148fea"} Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.936722 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c11-account-create-8bfzg" Oct 02 11:29:26 crc kubenswrapper[4929]: I1002 11:29:26.936733 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="585473797acb85cff242095f8bb41c87817d0d8769d4b07f7ca9828aa9148fea" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730106 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6zpcs"] Oct 02 11:29:27 crc kubenswrapper[4929]: E1002 11:29:27.730423 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4490c9-7acf-4c3e-96d9-e88b6d777b8a" containerName="mariadb-account-create" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730438 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4490c9-7acf-4c3e-96d9-e88b6d777b8a" containerName="mariadb-account-create" Oct 02 11:29:27 crc kubenswrapper[4929]: E1002 11:29:27.730446 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89926e8c-7fcf-4877-8be1-8bce650c2ca1" containerName="mariadb-account-create" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730453 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="89926e8c-7fcf-4877-8be1-8bce650c2ca1" containerName="mariadb-account-create" Oct 02 11:29:27 crc kubenswrapper[4929]: E1002 11:29:27.730466 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerName="dnsmasq-dns" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730473 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerName="dnsmasq-dns" Oct 02 11:29:27 crc kubenswrapper[4929]: E1002 11:29:27.730482 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerName="init" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730488 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerName="init" Oct 02 11:29:27 crc kubenswrapper[4929]: E1002 11:29:27.730496 4929 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55506978-e694-48ad-be63-ad2179c36f0f" containerName="mariadb-database-create" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730501 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="55506978-e694-48ad-be63-ad2179c36f0f" containerName="mariadb-database-create" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730648 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e81b61-3c27-40f2-8418-eb7fa38d00a5" containerName="dnsmasq-dns" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730656 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="89926e8c-7fcf-4877-8be1-8bce650c2ca1" containerName="mariadb-account-create" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730672 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="55506978-e694-48ad-be63-ad2179c36f0f" containerName="mariadb-database-create" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.730679 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b4490c9-7acf-4c3e-96d9-e88b6d777b8a" containerName="mariadb-account-create" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.731243 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.733183 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9kk67" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.733507 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.744549 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6zpcs"] Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.875777 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-combined-ca-bundle\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.875941 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-config-data\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.875990 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqn9p\" (UniqueName: \"kubernetes.io/projected/de08030d-c53a-49cc-906e-10b22cd577e1-kube-api-access-nqn9p\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.876066 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-db-sync-config-data\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.977875 4929 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-combined-ca-bundle\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.978085 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-config-data\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.978141 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqn9p\" (UniqueName: \"kubernetes.io/projected/de08030d-c53a-49cc-906e-10b22cd577e1-kube-api-access-nqn9p\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.978175 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-db-sync-config-data\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.983788 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-combined-ca-bundle\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.984054 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-db-sync-config-data\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.984033 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-config-data\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:27 crc kubenswrapper[4929]: I1002 11:29:27.997338 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqn9p\" (UniqueName: \"kubernetes.io/projected/de08030d-c53a-49cc-906e-10b22cd577e1-kube-api-access-nqn9p\") pod \"glance-db-sync-6zpcs\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:28 crc kubenswrapper[4929]: I1002 11:29:28.047684 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:28 crc kubenswrapper[4929]: I1002 11:29:28.605742 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6zpcs"] Oct 02 11:29:28 crc kubenswrapper[4929]: I1002 11:29:28.951718 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6zpcs" event={"ID":"de08030d-c53a-49cc-906e-10b22cd577e1","Type":"ContainerStarted","Data":"cb0b4cba2cf64557042db7bdefe99d20453726e15ccc31b74740925669bbee4e"} Oct 02 11:29:30 crc kubenswrapper[4929]: I1002 11:29:30.968196 4929 generic.go:334] "Generic (PLEG): container finished" podID="29989848-bd4f-4d93-a71f-95965ef153e8" containerID="96f765af27e40428cb700ca16b6c24d9da1ada736c4753ceb2817710484416f8" exitCode=0 Oct 02 11:29:30 crc kubenswrapper[4929]: I1002 11:29:30.968281 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9tlgd" event={"ID":"29989848-bd4f-4d93-a71f-95965ef153e8","Type":"ContainerDied","Data":"96f765af27e40428cb700ca16b6c24d9da1ada736c4753ceb2817710484416f8"} Oct 02 11:29:31 crc kubenswrapper[4929]: I1002 11:29:31.134988 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:31 crc kubenswrapper[4929]: I1002 11:29:31.148661 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"swift-storage-0\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") " pod="openstack/swift-storage-0" Oct 02 11:29:31 crc kubenswrapper[4929]: I1002 11:29:31.339366 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 11:29:31 crc kubenswrapper[4929]: I1002 11:29:31.433569 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:29:31 crc kubenswrapper[4929]: I1002 11:29:31.981025 4929 generic.go:334] "Generic (PLEG): container finished" podID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerID="d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919" exitCode=0 Oct 02 11:29:31 crc kubenswrapper[4929]: I1002 11:29:31.981111 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"be704e8e-9b46-4dfb-9363-278e61720eaa","Type":"ContainerDied","Data":"d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919"} Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.046394 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:29:32 crc kubenswrapper[4929]: W1002 11:29:32.068794 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fca7cc0_4347_4fb0_99a2_5bdef9efd204.slice/crio-ee15389ee8c608078322513cb85144936d51506916342c258b7db63e62bac1dd WatchSource:0}: Error finding container ee15389ee8c608078322513cb85144936d51506916342c258b7db63e62bac1dd: Status 404 returned error can't find the container with id ee15389ee8c608078322513cb85144936d51506916342c258b7db63e62bac1dd Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.307217 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.359912 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-swiftconf\") pod \"29989848-bd4f-4d93-a71f-95965ef153e8\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.360345 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-combined-ca-bundle\") pod \"29989848-bd4f-4d93-a71f-95965ef153e8\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.360402 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-dispersionconf\") pod \"29989848-bd4f-4d93-a71f-95965ef153e8\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.360495 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-scripts\") pod \"29989848-bd4f-4d93-a71f-95965ef153e8\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.360612 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-494wn\" (UniqueName: \"kubernetes.io/projected/29989848-bd4f-4d93-a71f-95965ef153e8-kube-api-access-494wn\") pod \"29989848-bd4f-4d93-a71f-95965ef153e8\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.360715 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-ring-data-devices\") pod \"29989848-bd4f-4d93-a71f-95965ef153e8\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.360872 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29989848-bd4f-4d93-a71f-95965ef153e8-etc-swift\") pod \"29989848-bd4f-4d93-a71f-95965ef153e8\" (UID: \"29989848-bd4f-4d93-a71f-95965ef153e8\") " Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.361372 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "29989848-bd4f-4d93-a71f-95965ef153e8" (UID: "29989848-bd4f-4d93-a71f-95965ef153e8"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.361562 4929 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.364624 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29989848-bd4f-4d93-a71f-95965ef153e8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "29989848-bd4f-4d93-a71f-95965ef153e8" (UID: "29989848-bd4f-4d93-a71f-95965ef153e8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.365817 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29989848-bd4f-4d93-a71f-95965ef153e8-kube-api-access-494wn" (OuterVolumeSpecName: "kube-api-access-494wn") pod "29989848-bd4f-4d93-a71f-95965ef153e8" (UID: "29989848-bd4f-4d93-a71f-95965ef153e8"). InnerVolumeSpecName "kube-api-access-494wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.374528 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "29989848-bd4f-4d93-a71f-95965ef153e8" (UID: "29989848-bd4f-4d93-a71f-95965ef153e8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.395191 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "29989848-bd4f-4d93-a71f-95965ef153e8" (UID: "29989848-bd4f-4d93-a71f-95965ef153e8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.395213 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29989848-bd4f-4d93-a71f-95965ef153e8" (UID: "29989848-bd4f-4d93-a71f-95965ef153e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.395634 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-scripts" (OuterVolumeSpecName: "scripts") pod "29989848-bd4f-4d93-a71f-95965ef153e8" (UID: "29989848-bd4f-4d93-a71f-95965ef153e8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.462734 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29989848-bd4f-4d93-a71f-95965ef153e8-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.463011 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-494wn\" (UniqueName: \"kubernetes.io/projected/29989848-bd4f-4d93-a71f-95965ef153e8-kube-api-access-494wn\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.463092 4929 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/29989848-bd4f-4d93-a71f-95965ef153e8-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.463151 4929 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.463204 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:32 crc kubenswrapper[4929]: I1002 11:29:32.463265 4929 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/29989848-bd4f-4d93-a71f-95965ef153e8-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:33 crc kubenswrapper[4929]: I1002 11:29:33.006766 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"ee15389ee8c608078322513cb85144936d51506916342c258b7db63e62bac1dd"} Oct 02 11:29:33 crc kubenswrapper[4929]: I1002 11:29:33.009710 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9tlgd" event={"ID":"29989848-bd4f-4d93-a71f-95965ef153e8","Type":"ContainerDied","Data":"98278fb2c70ffbdb56e906808a4dee563bcb88c5f1e3adb034830762bec07c9d"} Oct 02 11:29:33 crc kubenswrapper[4929]: I1002 11:29:33.009735 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9tlgd" Oct 02 11:29:33 crc kubenswrapper[4929]: I1002 11:29:33.009750 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98278fb2c70ffbdb56e906808a4dee563bcb88c5f1e3adb034830762bec07c9d" Oct 02 11:29:33 crc kubenswrapper[4929]: I1002 11:29:33.019586 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"be704e8e-9b46-4dfb-9363-278e61720eaa","Type":"ContainerStarted","Data":"53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0"} Oct 02 11:29:33 crc kubenswrapper[4929]: I1002 11:29:33.020410 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 11:29:34 crc kubenswrapper[4929]: I1002 11:29:34.030448 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"908ca2fd69c5108943cc878199a4b51016780d034eeff8322db65f1600694e85"} Oct 02 11:29:34 crc kubenswrapper[4929]: I1002 11:29:34.030722 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"c422966ff4f7462ce3b91b168f5ac09e2c599380e12f670a2a65404aef3dd588"} Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.619071 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8kqgz" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" probeResult="failure" output=< Oct 02 11:29:38 crc kubenswrapper[4929]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 11:29:38 crc kubenswrapper[4929]: > Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.636466 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.651146 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fv8ff" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.663591 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371965.191206 podStartE2EDuration="1m11.663569766s" podCreationTimestamp="2025-10-02 11:28:27 +0000 UTC" firstStartedPulling="2025-10-02 11:28:29.279733623 +0000 UTC m=+1109.830099987" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:29:33.050140875 +0000 UTC m=+1173.600507249" watchObservedRunningTime="2025-10-02 11:29:38.663569766 +0000 UTC m=+1179.213936130" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.859678 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8kqgz-config-fgmp4"] Oct 02 11:29:38 crc kubenswrapper[4929]: E1002 11:29:38.860078 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29989848-bd4f-4d93-a71f-95965ef153e8" containerName="swift-ring-rebalance" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.860094 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="29989848-bd4f-4d93-a71f-95965ef153e8" containerName="swift-ring-rebalance" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.860257 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="29989848-bd4f-4d93-a71f-95965ef153e8" containerName="swift-ring-rebalance" Oct 02 11:29:38 crc 
kubenswrapper[4929]: I1002 11:29:38.860916 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.865552 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.879329 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8kqgz-config-fgmp4"] Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.986736 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-log-ovn\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.986810 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwrtj\" (UniqueName: \"kubernetes.io/projected/b1231092-542d-4228-ac26-d661d6728d7c-kube-api-access-qwrtj\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.986860 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-additional-scripts\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.986882 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run-ovn\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.986904 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:38 crc kubenswrapper[4929]: I1002 11:29:38.986978 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-scripts\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.092275 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-scripts\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093148 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-log-ovn\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093211 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwrtj\" (UniqueName: \"kubernetes.io/projected/b1231092-542d-4228-ac26-d661d6728d7c-kube-api-access-qwrtj\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093282 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-additional-scripts\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093307 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run-ovn\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093329 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093625 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run-ovn\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.093625 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-log-ovn\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.094360 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-additional-scripts\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.098405 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-scripts\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.134464 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwrtj\" (UniqueName: \"kubernetes.io/projected/b1231092-542d-4228-ac26-d661d6728d7c-kube-api-access-qwrtj\") pod \"ovn-controller-8kqgz-config-fgmp4\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.179036 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:39 crc kubenswrapper[4929]: I1002 11:29:39.194131 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:29:40 crc kubenswrapper[4929]: I1002 11:29:40.316844 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8kqgz-config-fgmp4"] Oct 02 11:29:40 crc kubenswrapper[4929]: W1002 11:29:40.333675 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1231092_542d_4228_ac26_d661d6728d7c.slice/crio-5649cba96b4d80e0e601d2a092961df47039e89b62a99e315d685831d1661681 WatchSource:0}: Error finding container 5649cba96b4d80e0e601d2a092961df47039e89b62a99e315d685831d1661681: Status 404 returned error can't find the container with id 5649cba96b4d80e0e601d2a092961df47039e89b62a99e315d685831d1661681 Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.094822 4929 generic.go:334] "Generic (PLEG): container finished" podID="b1231092-542d-4228-ac26-d661d6728d7c" containerID="ed962c1b5d656d4e1215fbf1c48196a168d154f5fc0d6e13903f2e7ae10877e1" exitCode=0 Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.095065 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz-config-fgmp4" event={"ID":"b1231092-542d-4228-ac26-d661d6728d7c","Type":"ContainerDied","Data":"ed962c1b5d656d4e1215fbf1c48196a168d154f5fc0d6e13903f2e7ae10877e1"} Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.095129 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz-config-fgmp4" event={"ID":"b1231092-542d-4228-ac26-d661d6728d7c","Type":"ContainerStarted","Data":"5649cba96b4d80e0e601d2a092961df47039e89b62a99e315d685831d1661681"} Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.096498 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6zpcs" event={"ID":"de08030d-c53a-49cc-906e-10b22cd577e1","Type":"ContainerStarted","Data":"1746635e6e6644889048b899323774c25ad20cec9c641616c1b701921c91a205"} Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.098275 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"8035c6c8bfb09505e74f61334bb079defd5376ea17a1624b860ad93bb160b4a7"} Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.098301 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"cefdca76aa5689375d6555f161f627d6003d8fac34c8df5144f6cacb5ac6866c"} Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 
11:29:41.144275 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6zpcs" podStartSLOduration=2.781818998 podStartE2EDuration="14.144257724s" podCreationTimestamp="2025-10-02 11:29:27 +0000 UTC" firstStartedPulling="2025-10-02 11:29:28.615444769 +0000 UTC m=+1169.165811143" lastFinishedPulling="2025-10-02 11:29:39.977883505 +0000 UTC m=+1180.528249869" observedRunningTime="2025-10-02 11:29:41.14337912 +0000 UTC m=+1181.693745484" watchObservedRunningTime="2025-10-02 11:29:41.144257724 +0000 UTC m=+1181.694624088" Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.886213 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cda6-account-create-r9q8t"] Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.887890 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cda6-account-create-r9q8t" Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.891375 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.899132 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cda6-account-create-r9q8t"] Oct 02 11:29:41 crc kubenswrapper[4929]: I1002 11:29:41.945942 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/40dec924-bb71-4ac6-ac46-d77725459692-kube-api-access-vdpwj\") pod \"keystone-cda6-account-create-r9q8t\" (UID: \"40dec924-bb71-4ac6-ac46-d77725459692\") " pod="openstack/keystone-cda6-account-create-r9q8t" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.048424 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/40dec924-bb71-4ac6-ac46-d77725459692-kube-api-access-vdpwj\") pod \"keystone-cda6-account-create-r9q8t\" (UID: \"40dec924-bb71-4ac6-ac46-d77725459692\") " pod="openstack/keystone-cda6-account-create-r9q8t" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.067631 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/40dec924-bb71-4ac6-ac46-d77725459692-kube-api-access-vdpwj\") pod \"keystone-cda6-account-create-r9q8t\" (UID: \"40dec924-bb71-4ac6-ac46-d77725459692\") " pod="openstack/keystone-cda6-account-create-r9q8t" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.114925 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"d8d4bec396dfc299189ef1b9a62ea5ec2484a5fe0492556f6ab9d91d861c28eb"} Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.115202 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"b84dbe7e64f60d151dd9bf83d9d60d85c9c02ab74df4e15e5302c38e6a6cc41c"} Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.225513 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cda6-account-create-r9q8t" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.373992 4929 util.go:48] "No ready sandbox for pod can be found. 
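
The ovn-controller-8kqgz-config-fgmp4 run above (created 11:29:38.859, just after ovn-controller-8kqgz fails its readiness probe with "not connected") is a one-shot configuration helper: it mounts the host's OVN run and log directories plus script ConfigMaps, runs a single container that exits 0 at 11:29:41.094, and is torn down in the entries that follow. The hostPath and ConfigMap volumes in its VerifyControllerAttachedVolume lines would be declared roughly as below; the host paths are inferred from the volume names, and only the "ovncontroller-extra-scripts" ConfigMap name is confirmed by the reflector entry:

    // Sketch of the volumes behind ovn-controller-8kqgz-config-fgmp4.
    // Host paths are inferred from volume names, not read from the spec.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        dir := corev1.HostPathDirectory
        volumes := []corev1.Volume{
            {Name: "var-run-ovn", VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/var/run/ovn", Type: &dir}}}, // assumed path
            {Name: "var-log-ovn", VolumeSource: corev1.VolumeSource{
                HostPath: &corev1.HostPathVolumeSource{Path: "/var/log/ovn", Type: &dir}}}, // assumed path
            {Name: "additional-scripts", VolumeSource: corev1.VolumeSource{
                ConfigMap: &corev1.ConfigMapVolumeSource{
                    LocalObjectReference: corev1.LocalObjectReference{Name: "ovncontroller-extra-scripts"}}}},
        }
        fmt.Printf("%+v\n", volumes)
    }
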
Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.464192 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run\") pod \"b1231092-542d-4228-ac26-d661d6728d7c\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.464761 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-log-ovn\") pod \"b1231092-542d-4228-ac26-d661d6728d7c\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.464325 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run" (OuterVolumeSpecName: "var-run") pod "b1231092-542d-4228-ac26-d661d6728d7c" (UID: "b1231092-542d-4228-ac26-d661d6728d7c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.464821 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run-ovn\") pod \"b1231092-542d-4228-ac26-d661d6728d7c\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.464873 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b1231092-542d-4228-ac26-d661d6728d7c" (UID: "b1231092-542d-4228-ac26-d661d6728d7c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.464902 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b1231092-542d-4228-ac26-d661d6728d7c" (UID: "b1231092-542d-4228-ac26-d661d6728d7c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.465072 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-additional-scripts\") pod \"b1231092-542d-4228-ac26-d661d6728d7c\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.465270 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwrtj\" (UniqueName: \"kubernetes.io/projected/b1231092-542d-4228-ac26-d661d6728d7c-kube-api-access-qwrtj\") pod \"b1231092-542d-4228-ac26-d661d6728d7c\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.465407 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-scripts\") pod \"b1231092-542d-4228-ac26-d661d6728d7c\" (UID: \"b1231092-542d-4228-ac26-d661d6728d7c\") " Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.466094 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b1231092-542d-4228-ac26-d661d6728d7c" (UID: "b1231092-542d-4228-ac26-d661d6728d7c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.466322 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-scripts" (OuterVolumeSpecName: "scripts") pod "b1231092-542d-4228-ac26-d661d6728d7c" (UID: "b1231092-542d-4228-ac26-d661d6728d7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.466351 4929 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.466403 4929 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.466417 4929 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.466429 4929 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b1231092-542d-4228-ac26-d661d6728d7c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.473428 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1231092-542d-4228-ac26-d661d6728d7c-kube-api-access-qwrtj" (OuterVolumeSpecName: "kube-api-access-qwrtj") pod "b1231092-542d-4228-ac26-d661d6728d7c" (UID: "b1231092-542d-4228-ac26-d661d6728d7c"). InnerVolumeSpecName "kube-api-access-qwrtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.568037 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1231092-542d-4228-ac26-d661d6728d7c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.568066 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwrtj\" (UniqueName: \"kubernetes.io/projected/b1231092-542d-4228-ac26-d661d6728d7c-kube-api-access-qwrtj\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:42 crc kubenswrapper[4929]: I1002 11:29:42.748702 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cda6-account-create-r9q8t"] Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.125225 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cda6-account-create-r9q8t" event={"ID":"40dec924-bb71-4ac6-ac46-d77725459692","Type":"ContainerStarted","Data":"debb4348289999a8f3abaf03c4450711bafdb203a33eb5fe165ceb3841367bae"} Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.125559 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cda6-account-create-r9q8t" event={"ID":"40dec924-bb71-4ac6-ac46-d77725459692","Type":"ContainerStarted","Data":"270da1d93ab8fa33a709f08317fcb6de30c045f1e33b176ddf8961f43f2a9ab1"} Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.128304 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz-config-fgmp4" event={"ID":"b1231092-542d-4228-ac26-d661d6728d7c","Type":"ContainerDied","Data":"5649cba96b4d80e0e601d2a092961df47039e89b62a99e315d685831d1661681"} Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.128327 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5649cba96b4d80e0e601d2a092961df47039e89b62a99e315d685831d1661681" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.128364 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-fgmp4" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.141140 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"50576db641077b7a1e9dbf23a9fc5b7cda23206b43329a6994ae96a7b01bca1b"} Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.141178 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"7fc56f6f53a6276996cea4cb299fd663fe0652dc22f0c71496b40d33cbd4a999"} Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.193827 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cda6-account-create-r9q8t" podStartSLOduration=2.193811804 podStartE2EDuration="2.193811804s" podCreationTimestamp="2025-10-02 11:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:29:43.189739575 +0000 UTC m=+1183.740105939" watchObservedRunningTime="2025-10-02 11:29:43.193811804 +0000 UTC m=+1183.744178158" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.497408 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8kqgz-config-fgmp4"] Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.502428 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8kqgz-config-fgmp4"] Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.546274 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8kqgz-config-mc26p"] Oct 02 11:29:43 crc kubenswrapper[4929]: E1002 11:29:43.546685 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1231092-542d-4228-ac26-d661d6728d7c" containerName="ovn-config" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.546710 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1231092-542d-4228-ac26-d661d6728d7c" containerName="ovn-config" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.546933 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1231092-542d-4228-ac26-d661d6728d7c" containerName="ovn-config" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.547649 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.551029 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.555888 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8kqgz-config-mc26p"] Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.604747 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-8kqgz" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.693546 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-additional-scripts\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.693685 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qbd\" (UniqueName: \"kubernetes.io/projected/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-kube-api-access-c9qbd\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.693733 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.693760 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run-ovn\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.693784 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-log-ovn\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.693809 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-scripts\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.795185 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-additional-scripts\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.795308 
4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qbd\" (UniqueName: \"kubernetes.io/projected/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-kube-api-access-c9qbd\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.795360 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.795383 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run-ovn\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.795399 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-log-ovn\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.795420 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-scripts\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.796091 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run-ovn\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.796095 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-log-ovn\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.796283 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-additional-scripts\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.796385 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.797520 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-scripts\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.815686 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qbd\" (UniqueName: \"kubernetes.io/projected/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-kube-api-access-c9qbd\") pod \"ovn-controller-8kqgz-config-mc26p\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:43 crc kubenswrapper[4929]: I1002 11:29:43.871929 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:44 crc kubenswrapper[4929]: I1002 11:29:44.153691 4929 generic.go:334] "Generic (PLEG): container finished" podID="40dec924-bb71-4ac6-ac46-d77725459692" containerID="debb4348289999a8f3abaf03c4450711bafdb203a33eb5fe165ceb3841367bae" exitCode=0 Oct 02 11:29:44 crc kubenswrapper[4929]: I1002 11:29:44.154083 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cda6-account-create-r9q8t" event={"ID":"40dec924-bb71-4ac6-ac46-d77725459692","Type":"ContainerDied","Data":"debb4348289999a8f3abaf03c4450711bafdb203a33eb5fe165ceb3841367bae"} Oct 02 11:29:44 crc kubenswrapper[4929]: I1002 11:29:44.174074 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1231092-542d-4228-ac26-d661d6728d7c" path="/var/lib/kubelet/pods/b1231092-542d-4228-ac26-d661d6728d7c/volumes" Oct 02 11:29:44 crc kubenswrapper[4929]: I1002 11:29:44.472475 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8kqgz-config-mc26p"] Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.166189 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz-config-mc26p" event={"ID":"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa","Type":"ContainerStarted","Data":"563132632ec9af4b517dd0c98a94e892d44876a724b6c8146cd51c68809ccdb5"} Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.166435 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz-config-mc26p" event={"ID":"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa","Type":"ContainerStarted","Data":"a127ae6d94a156555b7499ead552e88d144a114768c83e62a2f51963d596f66d"} Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.174001 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"e4b14c7b773820d673e84708764e3207f484e07e6581b05635251acbe436a01b"} Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.174045 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"8e7f1a184638b273c379d892f5706e40b2b0e6e8a03f5d40e8cf8e31bb64e072"} Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.174056 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"b60072b033a9f359be40981e533a243a7b09e82e55c795215b5c3ac05b529145"} Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.174065 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"4c94d3591c6c3e15abde4c9e9bda1a1d6451806b1b6b0c671dfee4007ae1a8e3"} Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.186215 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8kqgz-config-mc26p" podStartSLOduration=2.186191286 podStartE2EDuration="2.186191286s" podCreationTimestamp="2025-10-02 11:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:29:45.185032774 +0000 UTC m=+1185.735399138" watchObservedRunningTime="2025-10-02 11:29:45.186191286 +0000 UTC m=+1185.736557660" Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.511379 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cda6-account-create-r9q8t" Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.646024 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/40dec924-bb71-4ac6-ac46-d77725459692-kube-api-access-vdpwj\") pod \"40dec924-bb71-4ac6-ac46-d77725459692\" (UID: \"40dec924-bb71-4ac6-ac46-d77725459692\") " Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.651413 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40dec924-bb71-4ac6-ac46-d77725459692-kube-api-access-vdpwj" (OuterVolumeSpecName: "kube-api-access-vdpwj") pod "40dec924-bb71-4ac6-ac46-d77725459692" (UID: "40dec924-bb71-4ac6-ac46-d77725459692"). InnerVolumeSpecName "kube-api-access-vdpwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:45 crc kubenswrapper[4929]: I1002 11:29:45.747256 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdpwj\" (UniqueName: \"kubernetes.io/projected/40dec924-bb71-4ac6-ac46-d77725459692-kube-api-access-vdpwj\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.185418 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"ad49c21a672c805ec312d5fb5f9c9032867c22231864156a14347d73f9b26ac2"} Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.185769 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"d463cb431a68d8c8a6bc8838afb25b1c342f4bd84ce2023c1ef8358a6d79a0eb"} Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.185786 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerStarted","Data":"1d45e6424e955430dffa5579cf4f5d18c47a7931b6e630ff334c44c39257c19c"} Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.188579 4929 generic.go:334] "Generic (PLEG): container finished" podID="5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" containerID="563132632ec9af4b517dd0c98a94e892d44876a724b6c8146cd51c68809ccdb5" exitCode=0 Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.188645 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz-config-mc26p" event={"ID":"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa","Type":"ContainerDied","Data":"563132632ec9af4b517dd0c98a94e892d44876a724b6c8146cd51c68809ccdb5"} Oct 02 
11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.189973 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cda6-account-create-r9q8t" event={"ID":"40dec924-bb71-4ac6-ac46-d77725459692","Type":"ContainerDied","Data":"270da1d93ab8fa33a709f08317fcb6de30c045f1e33b176ddf8961f43f2a9ab1"} Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.190003 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="270da1d93ab8fa33a709f08317fcb6de30c045f1e33b176ddf8961f43f2a9ab1" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.190048 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cda6-account-create-r9q8t" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.226362 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.274128145 podStartE2EDuration="32.226341539s" podCreationTimestamp="2025-10-02 11:29:14 +0000 UTC" firstStartedPulling="2025-10-02 11:29:32.071641641 +0000 UTC m=+1172.622008005" lastFinishedPulling="2025-10-02 11:29:44.023855035 +0000 UTC m=+1184.574221399" observedRunningTime="2025-10-02 11:29:46.217310376 +0000 UTC m=+1186.767676740" watchObservedRunningTime="2025-10-02 11:29:46.226341539 +0000 UTC m=+1186.776707913" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.494167 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7kp5l"] Oct 02 11:29:46 crc kubenswrapper[4929]: E1002 11:29:46.494508 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dec924-bb71-4ac6-ac46-d77725459692" containerName="mariadb-account-create" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.494520 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dec924-bb71-4ac6-ac46-d77725459692" containerName="mariadb-account-create" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.494682 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dec924-bb71-4ac6-ac46-d77725459692" containerName="mariadb-account-create" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.495527 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.497929 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.505073 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7kp5l"] Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.559673 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.559733 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.560059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.560168 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dpvs\" (UniqueName: \"kubernetes.io/projected/da7b43cd-cf10-4976-8a41-68179fbc3c64-kube-api-access-9dpvs\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.560233 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-config\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.560278 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.661256 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.661329 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dpvs\" (UniqueName: \"kubernetes.io/projected/da7b43cd-cf10-4976-8a41-68179fbc3c64-kube-api-access-9dpvs\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: 
\"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.661375 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-config\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.661433 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.661473 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.661517 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.662530 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-config\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.662628 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.662672 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.662951 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.663317 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: 
I1002 11:29:46.684105 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dpvs\" (UniqueName: \"kubernetes.io/projected/da7b43cd-cf10-4976-8a41-68179fbc3c64-kube-api-access-9dpvs\") pod \"dnsmasq-dns-5c79d794d7-7kp5l\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:46 crc kubenswrapper[4929]: I1002 11:29:46.813651 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:48 crc kubenswrapper[4929]: I1002 11:29:48.773084 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.114102 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-n7hrt"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.115718 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-n7hrt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.155870 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-n7hrt"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.204098 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwx6x\" (UniqueName: \"kubernetes.io/projected/9fcbd13a-778f-42a3-872a-92a03324a4be-kube-api-access-pwx6x\") pod \"barbican-db-create-n7hrt\" (UID: \"9fcbd13a-778f-42a3-872a-92a03324a4be\") " pod="openstack/barbican-db-create-n7hrt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.232829 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-84bfc"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.234008 4929 util.go:30] "No sandbox for pod can be found. 
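[Annotation] Every kubenswrapper payload in this journal follows the klog header layout: severity letter plus MMDD, wall-clock time, pid, source file:line, a closing "]", then the quoted message and key=value pairs. A small Go sketch that splits one of these records with a regular expression written against the lines seen here; the field names are my own, not an official schema:

package main

import (
	"fmt"
	"regexp"
)

// Matches e.g. `I1002 11:29:49.115718 4929 util.go:30] "..." pod="..."`.
var klogRe = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+) +(\d+) ([\w.-]+:\d+)\] (.*)$`)

func main() {
	line := `I1002 11:29:49.115718 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-n7hrt"`
	m := klogRe.FindStringSubmatch(line)
	if m == nil {
		panic("line did not match")
	}
	fmt.Println("severity:", m[1])  // I (info) / W / E / F
	fmt.Println("date MMDD:", m[2]) // 1002
	fmt.Println("time:", m[3])      // 11:29:49.115718
	fmt.Println("pid:", m[4])       // 4929
	fmt.Println("source:", m[5])    // util.go:30
	fmt.Println("payload:", m[6])   // quoted message + key=value pairs
}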
Need to start a new one" pod="openstack/cinder-db-create-84bfc" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.241633 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-84bfc"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.305423 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwx6x\" (UniqueName: \"kubernetes.io/projected/9fcbd13a-778f-42a3-872a-92a03324a4be-kube-api-access-pwx6x\") pod \"barbican-db-create-n7hrt\" (UID: \"9fcbd13a-778f-42a3-872a-92a03324a4be\") " pod="openstack/barbican-db-create-n7hrt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.305490 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45chx\" (UniqueName: \"kubernetes.io/projected/faf6eee5-d420-48f9-b5d7-e591fc2ba33c-kube-api-access-45chx\") pod \"cinder-db-create-84bfc\" (UID: \"faf6eee5-d420-48f9-b5d7-e591fc2ba33c\") " pod="openstack/cinder-db-create-84bfc" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.322898 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwx6x\" (UniqueName: \"kubernetes.io/projected/9fcbd13a-778f-42a3-872a-92a03324a4be-kube-api-access-pwx6x\") pod \"barbican-db-create-n7hrt\" (UID: \"9fcbd13a-778f-42a3-872a-92a03324a4be\") " pod="openstack/barbican-db-create-n7hrt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.372945 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kwgvk"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.374042 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.377359 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.377682 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.377705 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9qdpv" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.377719 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.395900 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kwgvk"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.406468 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45chx\" (UniqueName: \"kubernetes.io/projected/faf6eee5-d420-48f9-b5d7-e591fc2ba33c-kube-api-access-45chx\") pod \"cinder-db-create-84bfc\" (UID: \"faf6eee5-d420-48f9-b5d7-e591fc2ba33c\") " pod="openstack/cinder-db-create-84bfc" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.406757 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-config-data\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.406893 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-combined-ca-bundle\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.407051 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5dk\" (UniqueName: \"kubernetes.io/projected/e7812df5-741a-4a01-b77d-6b80b4e71090-kube-api-access-8w5dk\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.423173 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45chx\" (UniqueName: \"kubernetes.io/projected/faf6eee5-d420-48f9-b5d7-e591fc2ba33c-kube-api-access-45chx\") pod \"cinder-db-create-84bfc\" (UID: \"faf6eee5-d420-48f9-b5d7-e591fc2ba33c\") " pod="openstack/cinder-db-create-84bfc" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.469760 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-n7hrt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.498041 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2vplt"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.499159 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2vplt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.508197 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-combined-ca-bundle\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.508243 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5dk\" (UniqueName: \"kubernetes.io/projected/e7812df5-741a-4a01-b77d-6b80b4e71090-kube-api-access-8w5dk\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.508362 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-config-data\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.509508 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2vplt"] Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.511800 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-config-data\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.521314 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-combined-ca-bundle\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 
11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.555377 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-84bfc" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.576636 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5dk\" (UniqueName: \"kubernetes.io/projected/e7812df5-741a-4a01-b77d-6b80b4e71090-kube-api-access-8w5dk\") pod \"keystone-db-sync-kwgvk\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.610079 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qb4z\" (UniqueName: \"kubernetes.io/projected/e10e697f-2b74-44fd-9645-f996885417e5-kube-api-access-2qb4z\") pod \"neutron-db-create-2vplt\" (UID: \"e10e697f-2b74-44fd-9645-f996885417e5\") " pod="openstack/neutron-db-create-2vplt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.702503 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.711945 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qb4z\" (UniqueName: \"kubernetes.io/projected/e10e697f-2b74-44fd-9645-f996885417e5-kube-api-access-2qb4z\") pod \"neutron-db-create-2vplt\" (UID: \"e10e697f-2b74-44fd-9645-f996885417e5\") " pod="openstack/neutron-db-create-2vplt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.742410 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qb4z\" (UniqueName: \"kubernetes.io/projected/e10e697f-2b74-44fd-9645-f996885417e5-kube-api-access-2qb4z\") pod \"neutron-db-create-2vplt\" (UID: \"e10e697f-2b74-44fd-9645-f996885417e5\") " pod="openstack/neutron-db-create-2vplt" Oct 02 11:29:49 crc kubenswrapper[4929]: I1002 11:29:49.871938 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2vplt" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.082164 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.183230 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-additional-scripts\") pod \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.183566 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-log-ovn\") pod \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.183597 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run-ovn\") pod \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.183716 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9qbd\" (UniqueName: \"kubernetes.io/projected/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-kube-api-access-c9qbd\") pod \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.183734 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-scripts\") pod \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.183758 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run\") pod \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\" (UID: \"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa\") " Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.184098 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run" (OuterVolumeSpecName: "var-run") pod "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" (UID: "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.184755 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" (UID: "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.185593 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" (UID: "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.188451 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-scripts" (OuterVolumeSpecName: "scripts") pod "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" (UID: "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.188662 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" (UID: "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.195911 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-kube-api-access-c9qbd" (OuterVolumeSpecName: "kube-api-access-c9qbd") pod "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" (UID: "5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa"). InnerVolumeSpecName "kube-api-access-c9qbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.259551 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz-config-mc26p" event={"ID":"5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa","Type":"ContainerDied","Data":"a127ae6d94a156555b7499ead552e88d144a114768c83e62a2f51963d596f66d"} Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.259598 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a127ae6d94a156555b7499ead552e88d144a114768c83e62a2f51963d596f66d" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.259663 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8kqgz-config-mc26p" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.287213 4929 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.287251 4929 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.287264 4929 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.287272 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9qbd\" (UniqueName: \"kubernetes.io/projected/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-kube-api-access-c9qbd\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.287283 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.287292 4929 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.386181 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7kp5l"] Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.485265 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kwgvk"] Oct 02 11:29:54 crc kubenswrapper[4929]: W1002 11:29:54.493518 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7812df5_741a_4a01_b77d_6b80b4e71090.slice/crio-cb785cc34ba10d8c1064c4d9d0520b8ba482cbe105d6986ae734c2ed50f49580 WatchSource:0}: Error finding container cb785cc34ba10d8c1064c4d9d0520b8ba482cbe105d6986ae734c2ed50f49580: Status 404 returned error can't find the container with id cb785cc34ba10d8c1064c4d9d0520b8ba482cbe105d6986ae734c2ed50f49580 Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.566008 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-84bfc"] Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.582171 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-n7hrt"] Oct 02 11:29:54 crc kubenswrapper[4929]: I1002 11:29:54.587799 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2vplt"] Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.172641 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8kqgz-config-mc26p"] Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.187923 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8kqgz-config-mc26p"] Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.314341 4929 generic.go:334] "Generic (PLEG): container finished" podID="e10e697f-2b74-44fd-9645-f996885417e5" 
containerID="545256c2e5a3d079c962d1cd6d366f7f7284a0b50af8728c42f20445f2a074fc" exitCode=0 Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.314510 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2vplt" event={"ID":"e10e697f-2b74-44fd-9645-f996885417e5","Type":"ContainerDied","Data":"545256c2e5a3d079c962d1cd6d366f7f7284a0b50af8728c42f20445f2a074fc"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.314543 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2vplt" event={"ID":"e10e697f-2b74-44fd-9645-f996885417e5","Type":"ContainerStarted","Data":"98a7eac1295a2df796a47dee67fba2116738fc28ce6198015631eaaefd4d96d8"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.322300 4929 generic.go:334] "Generic (PLEG): container finished" podID="9fcbd13a-778f-42a3-872a-92a03324a4be" containerID="82486077057b1ee0fcab31077dd4225d2021103abf86b2dbd1c7cfe1ea382477" exitCode=0 Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.322603 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-n7hrt" event={"ID":"9fcbd13a-778f-42a3-872a-92a03324a4be","Type":"ContainerDied","Data":"82486077057b1ee0fcab31077dd4225d2021103abf86b2dbd1c7cfe1ea382477"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.322715 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-n7hrt" event={"ID":"9fcbd13a-778f-42a3-872a-92a03324a4be","Type":"ContainerStarted","Data":"97f947b48b5ab1b0fa74a5abbbb2f56262f24a50f8e7a104f061b62afeb46bba"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.334412 4929 generic.go:334] "Generic (PLEG): container finished" podID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerID="0650928908ebca0a5696a7f0cfb760a2127509721b37e8cfd794979564b4fcc2" exitCode=0 Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.334508 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" event={"ID":"da7b43cd-cf10-4976-8a41-68179fbc3c64","Type":"ContainerDied","Data":"0650928908ebca0a5696a7f0cfb760a2127509721b37e8cfd794979564b4fcc2"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.334562 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" event={"ID":"da7b43cd-cf10-4976-8a41-68179fbc3c64","Type":"ContainerStarted","Data":"8e5af56285a8569566d213762118b055d5ebab65cdeee3c652533b079fb60ad7"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.342752 4929 generic.go:334] "Generic (PLEG): container finished" podID="faf6eee5-d420-48f9-b5d7-e591fc2ba33c" containerID="58658af3353a3f18bc660e889c500b662caf020ac76f0cd05852a0df879fac0b" exitCode=0 Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.342850 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-84bfc" event={"ID":"faf6eee5-d420-48f9-b5d7-e591fc2ba33c","Type":"ContainerDied","Data":"58658af3353a3f18bc660e889c500b662caf020ac76f0cd05852a0df879fac0b"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.342877 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-84bfc" event={"ID":"faf6eee5-d420-48f9-b5d7-e591fc2ba33c","Type":"ContainerStarted","Data":"5f453b68f714338772d9b99bb574755a1995ba73684e7acd287697623f62b5fe"} Oct 02 11:29:55 crc kubenswrapper[4929]: I1002 11:29:55.344746 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kwgvk" 
event={"ID":"e7812df5-741a-4a01-b77d-6b80b4e71090","Type":"ContainerStarted","Data":"cb785cc34ba10d8c1064c4d9d0520b8ba482cbe105d6986ae734c2ed50f49580"} Oct 02 11:29:56 crc kubenswrapper[4929]: I1002 11:29:56.182436 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" path="/var/lib/kubelet/pods/5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa/volumes" Oct 02 11:29:56 crc kubenswrapper[4929]: I1002 11:29:56.352384 4929 generic.go:334] "Generic (PLEG): container finished" podID="de08030d-c53a-49cc-906e-10b22cd577e1" containerID="1746635e6e6644889048b899323774c25ad20cec9c641616c1b701921c91a205" exitCode=0 Oct 02 11:29:56 crc kubenswrapper[4929]: I1002 11:29:56.352450 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6zpcs" event={"ID":"de08030d-c53a-49cc-906e-10b22cd577e1","Type":"ContainerDied","Data":"1746635e6e6644889048b899323774c25ad20cec9c641616c1b701921c91a205"} Oct 02 11:29:56 crc kubenswrapper[4929]: I1002 11:29:56.355463 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" event={"ID":"da7b43cd-cf10-4976-8a41-68179fbc3c64","Type":"ContainerStarted","Data":"56b7571e6f15b543e6d31bfc9bba2f675c81efab51bbc2c75f1ff5add51e2423"} Oct 02 11:29:56 crc kubenswrapper[4929]: I1002 11:29:56.391670 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" podStartSLOduration=10.39165152 podStartE2EDuration="10.39165152s" podCreationTimestamp="2025-10-02 11:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:29:56.386412479 +0000 UTC m=+1196.936778833" watchObservedRunningTime="2025-10-02 11:29:56.39165152 +0000 UTC m=+1196.942017884" Oct 02 11:29:56 crc kubenswrapper[4929]: I1002 11:29:56.818321 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.094069 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-n7hrt" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.103822 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2vplt" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.115228 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.149987 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-84bfc" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.215902 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwx6x\" (UniqueName: \"kubernetes.io/projected/9fcbd13a-778f-42a3-872a-92a03324a4be-kube-api-access-pwx6x\") pod \"9fcbd13a-778f-42a3-872a-92a03324a4be\" (UID: \"9fcbd13a-778f-42a3-872a-92a03324a4be\") " Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.215998 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45chx\" (UniqueName: \"kubernetes.io/projected/faf6eee5-d420-48f9-b5d7-e591fc2ba33c-kube-api-access-45chx\") pod \"faf6eee5-d420-48f9-b5d7-e591fc2ba33c\" (UID: \"faf6eee5-d420-48f9-b5d7-e591fc2ba33c\") " Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.216021 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-db-sync-config-data\") pod \"de08030d-c53a-49cc-906e-10b22cd577e1\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.216062 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqn9p\" (UniqueName: \"kubernetes.io/projected/de08030d-c53a-49cc-906e-10b22cd577e1-kube-api-access-nqn9p\") pod \"de08030d-c53a-49cc-906e-10b22cd577e1\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.216162 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-combined-ca-bundle\") pod \"de08030d-c53a-49cc-906e-10b22cd577e1\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.216217 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qb4z\" (UniqueName: \"kubernetes.io/projected/e10e697f-2b74-44fd-9645-f996885417e5-kube-api-access-2qb4z\") pod \"e10e697f-2b74-44fd-9645-f996885417e5\" (UID: \"e10e697f-2b74-44fd-9645-f996885417e5\") " Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.216266 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-config-data\") pod \"de08030d-c53a-49cc-906e-10b22cd577e1\" (UID: \"de08030d-c53a-49cc-906e-10b22cd577e1\") " Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.221261 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fcbd13a-778f-42a3-872a-92a03324a4be-kube-api-access-pwx6x" (OuterVolumeSpecName: "kube-api-access-pwx6x") pod "9fcbd13a-778f-42a3-872a-92a03324a4be" (UID: "9fcbd13a-778f-42a3-872a-92a03324a4be"). InnerVolumeSpecName "kube-api-access-pwx6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.221339 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10e697f-2b74-44fd-9645-f996885417e5-kube-api-access-2qb4z" (OuterVolumeSpecName: "kube-api-access-2qb4z") pod "e10e697f-2b74-44fd-9645-f996885417e5" (UID: "e10e697f-2b74-44fd-9645-f996885417e5"). InnerVolumeSpecName "kube-api-access-2qb4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.221354 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf6eee5-d420-48f9-b5d7-e591fc2ba33c-kube-api-access-45chx" (OuterVolumeSpecName: "kube-api-access-45chx") pod "faf6eee5-d420-48f9-b5d7-e591fc2ba33c" (UID: "faf6eee5-d420-48f9-b5d7-e591fc2ba33c"). InnerVolumeSpecName "kube-api-access-45chx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.223116 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de08030d-c53a-49cc-906e-10b22cd577e1-kube-api-access-nqn9p" (OuterVolumeSpecName: "kube-api-access-nqn9p") pod "de08030d-c53a-49cc-906e-10b22cd577e1" (UID: "de08030d-c53a-49cc-906e-10b22cd577e1"). InnerVolumeSpecName "kube-api-access-nqn9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.229220 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "de08030d-c53a-49cc-906e-10b22cd577e1" (UID: "de08030d-c53a-49cc-906e-10b22cd577e1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.240976 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de08030d-c53a-49cc-906e-10b22cd577e1" (UID: "de08030d-c53a-49cc-906e-10b22cd577e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.269887 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-config-data" (OuterVolumeSpecName: "config-data") pod "de08030d-c53a-49cc-906e-10b22cd577e1" (UID: "de08030d-c53a-49cc-906e-10b22cd577e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.317680 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.317717 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qb4z\" (UniqueName: \"kubernetes.io/projected/e10e697f-2b74-44fd-9645-f996885417e5-kube-api-access-2qb4z\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.317730 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.317739 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwx6x\" (UniqueName: \"kubernetes.io/projected/9fcbd13a-778f-42a3-872a-92a03324a4be-kube-api-access-pwx6x\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.317748 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45chx\" (UniqueName: \"kubernetes.io/projected/faf6eee5-d420-48f9-b5d7-e591fc2ba33c-kube-api-access-45chx\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.317757 4929 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/de08030d-c53a-49cc-906e-10b22cd577e1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.317765 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqn9p\" (UniqueName: \"kubernetes.io/projected/de08030d-c53a-49cc-906e-10b22cd577e1-kube-api-access-nqn9p\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.393910 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6zpcs" event={"ID":"de08030d-c53a-49cc-906e-10b22cd577e1","Type":"ContainerDied","Data":"cb0b4cba2cf64557042db7bdefe99d20453726e15ccc31b74740925669bbee4e"} Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.394019 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb0b4cba2cf64557042db7bdefe99d20453726e15ccc31b74740925669bbee4e" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.393921 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6zpcs" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.395368 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-84bfc" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.395389 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-84bfc" event={"ID":"faf6eee5-d420-48f9-b5d7-e591fc2ba33c","Type":"ContainerDied","Data":"5f453b68f714338772d9b99bb574755a1995ba73684e7acd287697623f62b5fe"} Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.395424 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f453b68f714338772d9b99bb574755a1995ba73684e7acd287697623f62b5fe" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.396800 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kwgvk" event={"ID":"e7812df5-741a-4a01-b77d-6b80b4e71090","Type":"ContainerStarted","Data":"bd2c5dbd1a7a392222669a7ca1331d0bc09ddb112ba5fed20e7f76db48ff7eed"} Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.398204 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2vplt" event={"ID":"e10e697f-2b74-44fd-9645-f996885417e5","Type":"ContainerDied","Data":"98a7eac1295a2df796a47dee67fba2116738fc28ce6198015631eaaefd4d96d8"} Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.398255 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98a7eac1295a2df796a47dee67fba2116738fc28ce6198015631eaaefd4d96d8" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.398370 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2vplt" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.399576 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-n7hrt" event={"ID":"9fcbd13a-778f-42a3-872a-92a03324a4be","Type":"ContainerDied","Data":"97f947b48b5ab1b0fa74a5abbbb2f56262f24a50f8e7a104f061b62afeb46bba"} Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.399598 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97f947b48b5ab1b0fa74a5abbbb2f56262f24a50f8e7a104f061b62afeb46bba" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.399900 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-n7hrt" Oct 02 11:29:59 crc kubenswrapper[4929]: I1002 11:29:59.431944 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kwgvk" podStartSLOduration=6.041584628 podStartE2EDuration="10.431926463s" podCreationTimestamp="2025-10-02 11:29:49 +0000 UTC" firstStartedPulling="2025-10-02 11:29:54.494898221 +0000 UTC m=+1195.045264585" lastFinishedPulling="2025-10-02 11:29:58.885240056 +0000 UTC m=+1199.435606420" observedRunningTime="2025-10-02 11:29:59.427068003 +0000 UTC m=+1199.977434387" watchObservedRunningTime="2025-10-02 11:29:59.431926463 +0000 UTC m=+1199.982292827" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.138474 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk"] Oct 02 11:30:00 crc kubenswrapper[4929]: E1002 11:30:00.139132 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcbd13a-778f-42a3-872a-92a03324a4be" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139145 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcbd13a-778f-42a3-872a-92a03324a4be" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: E1002 11:30:00.139160 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" containerName="ovn-config" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139165 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" containerName="ovn-config" Oct 02 11:30:00 crc kubenswrapper[4929]: E1002 11:30:00.139182 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf6eee5-d420-48f9-b5d7-e591fc2ba33c" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139188 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf6eee5-d420-48f9-b5d7-e591fc2ba33c" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: E1002 11:30:00.139206 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e10e697f-2b74-44fd-9645-f996885417e5" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139211 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10e697f-2b74-44fd-9645-f996885417e5" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: E1002 11:30:00.139225 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de08030d-c53a-49cc-906e-10b22cd577e1" containerName="glance-db-sync" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139231 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="de08030d-c53a-49cc-906e-10b22cd577e1" containerName="glance-db-sync" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139370 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e10e697f-2b74-44fd-9645-f996885417e5" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139387 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="de08030d-c53a-49cc-906e-10b22cd577e1" containerName="glance-db-sync" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139396 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcbd13a-778f-42a3-872a-92a03324a4be" containerName="mariadb-database-create" Oct 02 11:30:00 crc 
kubenswrapper[4929]: I1002 11:30:00.139407 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5650b8f3-92c5-4f35-9298-6cc3b5ea9bfa" containerName="ovn-config" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139417 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf6eee5-d420-48f9-b5d7-e591fc2ba33c" containerName="mariadb-database-create" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.139989 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.141845 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.142445 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.176651 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk"] Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.252950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbvp9\" (UniqueName: \"kubernetes.io/projected/3d427bb5-77e9-420b-aa34-52fa95ae93b1-kube-api-access-kbvp9\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.253161 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d427bb5-77e9-420b-aa34-52fa95ae93b1-secret-volume\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.253226 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d427bb5-77e9-420b-aa34-52fa95ae93b1-config-volume\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.355279 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d427bb5-77e9-420b-aa34-52fa95ae93b1-secret-volume\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.355592 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d427bb5-77e9-420b-aa34-52fa95ae93b1-config-volume\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.355718 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbvp9\" (UniqueName: 
\"kubernetes.io/projected/3d427bb5-77e9-420b-aa34-52fa95ae93b1-kube-api-access-kbvp9\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.360792 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d427bb5-77e9-420b-aa34-52fa95ae93b1-config-volume\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.361371 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d427bb5-77e9-420b-aa34-52fa95ae93b1-secret-volume\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.388695 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbvp9\" (UniqueName: \"kubernetes.io/projected/3d427bb5-77e9-420b-aa34-52fa95ae93b1-kube-api-access-kbvp9\") pod \"collect-profiles-29323410-5csgk\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.495895 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.585289 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7kp5l"] Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.585745 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerName="dnsmasq-dns" containerID="cri-o://56b7571e6f15b543e6d31bfc9bba2f675c81efab51bbc2c75f1ff5add51e2423" gracePeriod=10 Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.588367 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.627874 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwd5b"] Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.629242 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.683429 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwd5b"] Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.771889 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.771984 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-config\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.772059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.772148 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.772199 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.772318 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjc4\" (UniqueName: \"kubernetes.io/projected/07dae957-12b7-4d3c-8132-225c52efdbe0-kube-api-access-lrjc4\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.875787 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.875861 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.875904 4929 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjc4\" (UniqueName: \"kubernetes.io/projected/07dae957-12b7-4d3c-8132-225c52efdbe0-kube-api-access-lrjc4\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.875927 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.875948 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-config\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.876011 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.877633 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.879265 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.882658 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.884243 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-config\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.885037 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.901607 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjc4\" (UniqueName: 
\"kubernetes.io/projected/07dae957-12b7-4d3c-8132-225c52efdbe0-kube-api-access-lrjc4\") pod \"dnsmasq-dns-5f59b8f679-pwd5b\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:00 crc kubenswrapper[4929]: I1002 11:30:00.912623 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk"] Oct 02 11:30:01 crc kubenswrapper[4929]: I1002 11:30:01.078735 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:01 crc kubenswrapper[4929]: I1002 11:30:01.307730 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwd5b"] Oct 02 11:30:01 crc kubenswrapper[4929]: I1002 11:30:01.426728 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" event={"ID":"3d427bb5-77e9-420b-aa34-52fa95ae93b1","Type":"ContainerStarted","Data":"cfef63239e48f4bab8a8ef42f32455bde329eae903a0cefb60fc43115a4f34f5"} Oct 02 11:30:01 crc kubenswrapper[4929]: I1002 11:30:01.426771 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" event={"ID":"3d427bb5-77e9-420b-aa34-52fa95ae93b1","Type":"ContainerStarted","Data":"9e1607aa984ea05850bfbd9e3c45450989c856fc93f1be6f544eda703b338a8b"} Oct 02 11:30:01 crc kubenswrapper[4929]: I1002 11:30:01.429096 4929 generic.go:334] "Generic (PLEG): container finished" podID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerID="56b7571e6f15b543e6d31bfc9bba2f675c81efab51bbc2c75f1ff5add51e2423" exitCode=0 Oct 02 11:30:01 crc kubenswrapper[4929]: I1002 11:30:01.429145 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" event={"ID":"da7b43cd-cf10-4976-8a41-68179fbc3c64","Type":"ContainerDied","Data":"56b7571e6f15b543e6d31bfc9bba2f675c81efab51bbc2c75f1ff5add51e2423"} Oct 02 11:30:01 crc kubenswrapper[4929]: I1002 11:30:01.430411 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" event={"ID":"07dae957-12b7-4d3c-8132-225c52efdbe0","Type":"ContainerStarted","Data":"9ae7e7fdf12ac8ae8e53c948e70104037ba22f8f6e8fcb2a8a88b04c58b7c166"} Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.115414 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.200082 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-sb\") pod \"da7b43cd-cf10-4976-8a41-68179fbc3c64\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.200454 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-svc\") pod \"da7b43cd-cf10-4976-8a41-68179fbc3c64\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.200491 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-swift-storage-0\") pod \"da7b43cd-cf10-4976-8a41-68179fbc3c64\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.200552 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-nb\") pod \"da7b43cd-cf10-4976-8a41-68179fbc3c64\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.200579 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-config\") pod \"da7b43cd-cf10-4976-8a41-68179fbc3c64\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.200732 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dpvs\" (UniqueName: \"kubernetes.io/projected/da7b43cd-cf10-4976-8a41-68179fbc3c64-kube-api-access-9dpvs\") pod \"da7b43cd-cf10-4976-8a41-68179fbc3c64\" (UID: \"da7b43cd-cf10-4976-8a41-68179fbc3c64\") " Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.220337 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7b43cd-cf10-4976-8a41-68179fbc3c64-kube-api-access-9dpvs" (OuterVolumeSpecName: "kube-api-access-9dpvs") pod "da7b43cd-cf10-4976-8a41-68179fbc3c64" (UID: "da7b43cd-cf10-4976-8a41-68179fbc3c64"). InnerVolumeSpecName "kube-api-access-9dpvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.252899 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da7b43cd-cf10-4976-8a41-68179fbc3c64" (UID: "da7b43cd-cf10-4976-8a41-68179fbc3c64"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.252978 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da7b43cd-cf10-4976-8a41-68179fbc3c64" (UID: "da7b43cd-cf10-4976-8a41-68179fbc3c64"). InnerVolumeSpecName "ovsdbserver-nb". 
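Each "SyncLoop (PLEG)" record above embeds its event inline, e.g. event={"ID":"da7b43cd-...","Type":"ContainerDied","Data":"0650..."}: ID is the pod UID and Data the container (or sandbox) ID. A hedged Go stand-in for that payload, using only the two Type values that occur in this log; the struct is illustrative, not kubelet's internal type:

    package main

    import "fmt"

    // plegEventType mirrors the Type strings visible in this journal.
    type plegEventType string

    const (
        containerStarted plegEventType = "ContainerStarted"
        containerDied    plegEventType = "ContainerDied"
    )

    // plegEvent is an illustrative stand-in for the payload kubelet logs
    // as event={"ID":...,"Type":...,"Data":...}: ID is the pod UID, Data
    // the container (or sandbox) ID the event refers to.
    type plegEvent struct {
        ID   string
        Type plegEventType
        Data string
    }

    func main() {
        // Reconstructed from the dnsmasq-dns-5c79d794d7-7kp5l records
        // above: the init container exits, then dnsmasq-dns starts.
        events := []plegEvent{
            {"da7b43cd-cf10-4976-8a41-68179fbc3c64", containerDied, "0650928908ebca0a5696a7f0cfb760a2127509721b37e8cfd794979564b4fcc2"},
            {"da7b43cd-cf10-4976-8a41-68179fbc3c64", containerStarted, "56b7571e6f15b543e6d31bfc9bba2f675c81efab51bbc2c75f1ff5add51e2423"},
        }
        for _, e := range events {
            fmt.Printf("pod %s: %s %s\n", e.ID[:8], e.Type, e.Data[:12])
        }
    }
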
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.256380 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-config" (OuterVolumeSpecName: "config") pod "da7b43cd-cf10-4976-8a41-68179fbc3c64" (UID: "da7b43cd-cf10-4976-8a41-68179fbc3c64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.260528 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da7b43cd-cf10-4976-8a41-68179fbc3c64" (UID: "da7b43cd-cf10-4976-8a41-68179fbc3c64"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.270593 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da7b43cd-cf10-4976-8a41-68179fbc3c64" (UID: "da7b43cd-cf10-4976-8a41-68179fbc3c64"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.302417 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dpvs\" (UniqueName: \"kubernetes.io/projected/da7b43cd-cf10-4976-8a41-68179fbc3c64-kube-api-access-9dpvs\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.302613 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.302667 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.302715 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.302802 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.302853 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7b43cd-cf10-4976-8a41-68179fbc3c64-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.438988 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.438950 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" event={"ID":"da7b43cd-cf10-4976-8a41-68179fbc3c64","Type":"ContainerDied","Data":"8e5af56285a8569566d213762118b055d5ebab65cdeee3c652533b079fb60ad7"} Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.440102 4929 scope.go:117] "RemoveContainer" containerID="56b7571e6f15b543e6d31bfc9bba2f675c81efab51bbc2c75f1ff5add51e2423" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.451322 4929 generic.go:334] "Generic (PLEG): container finished" podID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerID="ef49351a5706ea2bd9bb79ed3eb3536f18d388bdf3c71250f0ddd4a7feb26017" exitCode=0 Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.451439 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" event={"ID":"07dae957-12b7-4d3c-8132-225c52efdbe0","Type":"ContainerDied","Data":"ef49351a5706ea2bd9bb79ed3eb3536f18d388bdf3c71250f0ddd4a7feb26017"} Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.455396 4929 generic.go:334] "Generic (PLEG): container finished" podID="3d427bb5-77e9-420b-aa34-52fa95ae93b1" containerID="cfef63239e48f4bab8a8ef42f32455bde329eae903a0cefb60fc43115a4f34f5" exitCode=0 Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.455442 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" event={"ID":"3d427bb5-77e9-420b-aa34-52fa95ae93b1","Type":"ContainerDied","Data":"cfef63239e48f4bab8a8ef42f32455bde329eae903a0cefb60fc43115a4f34f5"} Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.491262 4929 scope.go:117] "RemoveContainer" containerID="0650928908ebca0a5696a7f0cfb760a2127509721b37e8cfd794979564b4fcc2" Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.505422 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7kp5l"] Oct 02 11:30:02 crc kubenswrapper[4929]: I1002 11:30:02.514131 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7kp5l"] Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.466200 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" event={"ID":"07dae957-12b7-4d3c-8132-225c52efdbe0","Type":"ContainerStarted","Data":"ada30349953795951085122baf66aaa69533e72a60aebeccf1ba488cc4a936ee"} Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.494888 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" podStartSLOduration=3.49486082 podStartE2EDuration="3.49486082s" podCreationTimestamp="2025-10-02 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:03.486386322 +0000 UTC m=+1204.036752686" watchObservedRunningTime="2025-10-02 11:30:03.49486082 +0000 UTC m=+1204.045227204" Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.812769 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.934073 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d427bb5-77e9-420b-aa34-52fa95ae93b1-config-volume\") pod \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.934261 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbvp9\" (UniqueName: \"kubernetes.io/projected/3d427bb5-77e9-420b-aa34-52fa95ae93b1-kube-api-access-kbvp9\") pod \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.935224 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d427bb5-77e9-420b-aa34-52fa95ae93b1-secret-volume\") pod \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\" (UID: \"3d427bb5-77e9-420b-aa34-52fa95ae93b1\") " Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.938237 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d427bb5-77e9-420b-aa34-52fa95ae93b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "3d427bb5-77e9-420b-aa34-52fa95ae93b1" (UID: "3d427bb5-77e9-420b-aa34-52fa95ae93b1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.944040 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d427bb5-77e9-420b-aa34-52fa95ae93b1-kube-api-access-kbvp9" (OuterVolumeSpecName: "kube-api-access-kbvp9") pod "3d427bb5-77e9-420b-aa34-52fa95ae93b1" (UID: "3d427bb5-77e9-420b-aa34-52fa95ae93b1"). InnerVolumeSpecName "kube-api-access-kbvp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4929]: I1002 11:30:03.951372 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d427bb5-77e9-420b-aa34-52fa95ae93b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3d427bb5-77e9-420b-aa34-52fa95ae93b1" (UID: "3d427bb5-77e9-420b-aa34-52fa95ae93b1"). InnerVolumeSpecName "secret-volume". 
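The collect-profiles job being torn down here carries a scheduled-time suffix. Assuming the upstream CronJob naming scheme (the Job suffix is the scheduled time expressed as minutes since the Unix epoch), 29323410 decodes to exactly the 11:30:00 tick at which the SyncLoop ADD for this pod appeared above, which is what makes the assumption plausible. A quick check in Go:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed CronJob naming scheme: the Job suffix is the scheduled
        // time in minutes since the Unix epoch. 29323410 is taken from
        // collect-profiles-29323410-5csgk above.
        const scheduledMinutes = 29323410
        t := time.Unix(scheduledMinutes*60, 0).UTC()
        fmt.Println(t) // 2025-10-02 11:30:00 +0000 UTC
    }
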
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.037858 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbvp9\" (UniqueName: \"kubernetes.io/projected/3d427bb5-77e9-420b-aa34-52fa95ae93b1-kube-api-access-kbvp9\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.037931 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d427bb5-77e9-420b-aa34-52fa95ae93b1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.037946 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d427bb5-77e9-420b-aa34-52fa95ae93b1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.167131 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" path="/var/lib/kubelet/pods/da7b43cd-cf10-4976-8a41-68179fbc3c64/volumes" Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.475394 4929 generic.go:334] "Generic (PLEG): container finished" podID="e7812df5-741a-4a01-b77d-6b80b4e71090" containerID="bd2c5dbd1a7a392222669a7ca1331d0bc09ddb112ba5fed20e7f76db48ff7eed" exitCode=0 Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.475449 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kwgvk" event={"ID":"e7812df5-741a-4a01-b77d-6b80b4e71090","Type":"ContainerDied","Data":"bd2c5dbd1a7a392222669a7ca1331d0bc09ddb112ba5fed20e7f76db48ff7eed"} Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.478022 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.478395 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk" event={"ID":"3d427bb5-77e9-420b-aa34-52fa95ae93b1","Type":"ContainerDied","Data":"9e1607aa984ea05850bfbd9e3c45450989c856fc93f1be6f544eda703b338a8b"} Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.478416 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1607aa984ea05850bfbd9e3c45450989c856fc93f1be6f544eda703b338a8b" Oct 02 11:30:04 crc kubenswrapper[4929]: I1002 11:30:04.478431 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.816652 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.870626 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5dk\" (UniqueName: \"kubernetes.io/projected/e7812df5-741a-4a01-b77d-6b80b4e71090-kube-api-access-8w5dk\") pod \"e7812df5-741a-4a01-b77d-6b80b4e71090\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.870687 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-config-data\") pod \"e7812df5-741a-4a01-b77d-6b80b4e71090\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.870870 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-combined-ca-bundle\") pod \"e7812df5-741a-4a01-b77d-6b80b4e71090\" (UID: \"e7812df5-741a-4a01-b77d-6b80b4e71090\") " Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.883210 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7812df5-741a-4a01-b77d-6b80b4e71090-kube-api-access-8w5dk" (OuterVolumeSpecName: "kube-api-access-8w5dk") pod "e7812df5-741a-4a01-b77d-6b80b4e71090" (UID: "e7812df5-741a-4a01-b77d-6b80b4e71090"). InnerVolumeSpecName "kube-api-access-8w5dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.898600 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7812df5-741a-4a01-b77d-6b80b4e71090" (UID: "e7812df5-741a-4a01-b77d-6b80b4e71090"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.924241 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-config-data" (OuterVolumeSpecName: "config-data") pod "e7812df5-741a-4a01-b77d-6b80b4e71090" (UID: "e7812df5-741a-4a01-b77d-6b80b4e71090"). InnerVolumeSpecName "config-data". 
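The startup-latency record for keystone-db-sync-kwgvk earlier in this log printed podStartE2EDuration=10.431926463s and podStartSLOduration=6.041584628s; the two reconcile as E2E = watchObservedRunningTime - podCreationTimestamp and SLO = E2E - (lastFinishedPulling - firstStartedPulling), i.e. end-to-end minus image-pull time. That field mapping is my inference from the printed values, not a documented contract; the sketch below reproduces both numbers from the timestamps in that record:

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        // Layout matching the timestamps printed in the tracker record.
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Values copied from the keystone-db-sync-kwgvk record above.
        created := mustParse("2025-10-02 11:29:49 +0000 UTC")
        pullStart := mustParse("2025-10-02 11:29:54.494898221 +0000 UTC")
        pullEnd := mustParse("2025-10-02 11:29:58.885240056 +0000 UTC")
        observed := mustParse("2025-10-02 11:29:59.431926463 +0000 UTC")

        e2e := observed.Sub(created)        // podStartE2EDuration
        slo := e2e - pullEnd.Sub(pullStart) // minus image-pull time
        fmt.Println(e2e.Seconds(), slo.Seconds()) // 10.431926463 6.041584628
    }
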
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.972447 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5dk\" (UniqueName: \"kubernetes.io/projected/e7812df5-741a-4a01-b77d-6b80b4e71090-kube-api-access-8w5dk\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.972480 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:05 crc kubenswrapper[4929]: I1002 11:30:05.972492 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7812df5-741a-4a01-b77d-6b80b4e71090-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.494884 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kwgvk" event={"ID":"e7812df5-741a-4a01-b77d-6b80b4e71090","Type":"ContainerDied","Data":"cb785cc34ba10d8c1064c4d9d0520b8ba482cbe105d6986ae734c2ed50f49580"} Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.495251 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb785cc34ba10d8c1064c4d9d0520b8ba482cbe105d6986ae734c2ed50f49580" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.494953 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kwgvk" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.758744 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwd5b"] Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.761927 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" podUID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerName="dnsmasq-dns" containerID="cri-o://ada30349953795951085122baf66aaa69533e72a60aebeccf1ba488cc4a936ee" gracePeriod=10 Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794001 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hqscp"] Oct 02 11:30:06 crc kubenswrapper[4929]: E1002 11:30:06.794411 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerName="dnsmasq-dns" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794425 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerName="dnsmasq-dns" Oct 02 11:30:06 crc kubenswrapper[4929]: E1002 11:30:06.794445 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d427bb5-77e9-420b-aa34-52fa95ae93b1" containerName="collect-profiles" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794455 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d427bb5-77e9-420b-aa34-52fa95ae93b1" containerName="collect-profiles" Oct 02 11:30:06 crc kubenswrapper[4929]: E1002 11:30:06.794469 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerName="init" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794477 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerName="init" Oct 02 11:30:06 crc kubenswrapper[4929]: E1002 11:30:06.794498 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e7812df5-741a-4a01-b77d-6b80b4e71090" containerName="keystone-db-sync" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794504 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7812df5-741a-4a01-b77d-6b80b4e71090" containerName="keystone-db-sync" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794668 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerName="dnsmasq-dns" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794680 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d427bb5-77e9-420b-aa34-52fa95ae93b1" containerName="collect-profiles" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.794712 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7812df5-741a-4a01-b77d-6b80b4e71090" containerName="keystone-db-sync" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.795254 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.799565 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-7gp7d"] Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.799591 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.799645 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.799674 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.799706 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9qdpv" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.800886 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.816609 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-7kp5l" podUID="da7b43cd-cf10-4976-8a41-68179fbc3c64" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.820443 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hqscp"] Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.826270 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-7gp7d"] Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898358 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-credential-keys\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898408 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-combined-ca-bundle\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898441 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-config\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898520 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-scripts\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898548 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-fernet-keys\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898642 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898708 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898749 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898799 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdvgr\" (UniqueName: \"kubernetes.io/projected/1419c407-f3ab-4214-8d8a-fc76d15b322e-kube-api-access-vdvgr\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898832 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-config-data\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898854 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lb78\" (UniqueName: \"kubernetes.io/projected/df66b312-cfee-4f1b-b547-d0e334ea6191-kube-api-access-4lb78\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:06 crc kubenswrapper[4929]: I1002 11:30:06.898881 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000252 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000581 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdvgr\" (UniqueName: \"kubernetes.io/projected/1419c407-f3ab-4214-8d8a-fc76d15b322e-kube-api-access-vdvgr\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000610 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lb78\" (UniqueName: \"kubernetes.io/projected/df66b312-cfee-4f1b-b547-d0e334ea6191-kube-api-access-4lb78\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000629 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-config-data\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc 
kubenswrapper[4929]: I1002 11:30:07.000658 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000695 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-credential-keys\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000715 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-combined-ca-bundle\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000744 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-config\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000788 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-scripts\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000814 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-fernet-keys\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000864 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.000905 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.001102 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.001659 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.006621 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-config\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.007969 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.008088 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.008137 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-credential-keys\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.011344 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-scripts\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.011689 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-fernet-keys\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.013128 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-combined-ca-bundle\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.019771 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-config-data\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.031646 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdvgr\" (UniqueName: \"kubernetes.io/projected/1419c407-f3ab-4214-8d8a-fc76d15b322e-kube-api-access-vdvgr\") pod \"keystone-bootstrap-hqscp\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " 
pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.037715 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lb78\" (UniqueName: \"kubernetes.io/projected/df66b312-cfee-4f1b-b547-d0e334ea6191-kube-api-access-4lb78\") pod \"dnsmasq-dns-bbf5cc879-7gp7d\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.039423 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.041570 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.044423 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.044674 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.055535 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.102120 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-config-data\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.102172 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.102193 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvv5m\" (UniqueName: \"kubernetes.io/projected/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-kube-api-access-wvv5m\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.102224 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-log-httpd\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.102254 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.102289 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-run-httpd\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.102320 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-scripts\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.115552 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.127613 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.179489 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-7gp7d"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.205927 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-scripts\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.209900 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-j8b9x"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.211314 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.211471 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-config-data\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.211583 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.211612 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvv5m\" (UniqueName: \"kubernetes.io/projected/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-kube-api-access-wvv5m\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.211678 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-log-httpd\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.211746 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.211824 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-run-httpd\") pod \"ceilometer-0\" (UID: 
\"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.212322 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-run-httpd\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.216556 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-config-data\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.219004 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gt6zf" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.219020 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.219292 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.226193 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.226633 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-log-httpd\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.234755 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-scripts\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.235642 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.243739 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvv5m\" (UniqueName: \"kubernetes.io/projected/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-kube-api-access-wvv5m\") pod \"ceilometer-0\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") " pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.247069 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mh9sd"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.248837 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.265479 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-j8b9x"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.291282 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mh9sd"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316699 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mpgn\" (UniqueName: \"kubernetes.io/projected/471329a2-ca2f-4ba2-b750-32f88de79c8f-kube-api-access-4mpgn\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316740 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316780 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316804 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316822 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-combined-ca-bundle\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316849 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-config-data\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316869 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-config\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316894 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2lrg\" (UniqueName: \"kubernetes.io/projected/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-kube-api-access-q2lrg\") pod \"placement-db-sync-j8b9x\" (UID: 
\"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316911 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-logs\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.316944 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.317016 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-scripts\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.410378 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.419597 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-scripts\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.420480 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mpgn\" (UniqueName: \"kubernetes.io/projected/471329a2-ca2f-4ba2-b750-32f88de79c8f-kube-api-access-4mpgn\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.420562 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.421598 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.422408 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.422456 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.422512 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.422537 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-combined-ca-bundle\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423265 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-config-data\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423312 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-config\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423364 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2lrg\" (UniqueName: \"kubernetes.io/projected/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-kube-api-access-q2lrg\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423369 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423398 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-logs\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423502 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423262 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-scripts\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " 
pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.423921 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-config\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.424472 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-logs\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.424512 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.426632 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-combined-ca-bundle\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.426898 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-config-data\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.448728 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mpgn\" (UniqueName: \"kubernetes.io/projected/471329a2-ca2f-4ba2-b750-32f88de79c8f-kube-api-access-4mpgn\") pod \"dnsmasq-dns-56df8fb6b7-mh9sd\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.452530 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2lrg\" (UniqueName: \"kubernetes.io/projected/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-kube-api-access-q2lrg\") pod \"placement-db-sync-j8b9x\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.494322 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hqscp"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.536234 4929 generic.go:334] "Generic (PLEG): container finished" podID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerID="ada30349953795951085122baf66aaa69533e72a60aebeccf1ba488cc4a936ee" exitCode=0 Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.536337 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" event={"ID":"07dae957-12b7-4d3c-8132-225c52efdbe0","Type":"ContainerDied","Data":"ada30349953795951085122baf66aaa69533e72a60aebeccf1ba488cc4a936ee"} Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.557353 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.569315 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.582333 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hqscp" event={"ID":"1419c407-f3ab-4214-8d8a-fc76d15b322e","Type":"ContainerStarted","Data":"79d1a901f343f31fb92a6ec52d277e1cdba48ded27a1571bec560888f491dd85"} Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.757502 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-7gp7d"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.911653 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.913511 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.924186 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9kk67" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.924360 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.924993 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.925117 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933341 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933380 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933405 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933429 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tvbf\" (UniqueName: \"kubernetes.io/projected/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-kube-api-access-4tvbf\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933472 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933556 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933825 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.933988 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-logs\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.936226 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.985569 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.988056 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.991668 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:30:07 crc kubenswrapper[4929]: I1002 11:30:07.991855 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.001188 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.028568 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.038793 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.038898 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-logs\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.039470 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-logs\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.039541 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.039559 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.039577 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.039596 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tvbf\" (UniqueName: \"kubernetes.io/projected/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-kube-api-access-4tvbf\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.039622 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.039665 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 
11:30:08.039903 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.040344 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.043849 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.046075 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.048435 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.054621 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.071389 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tvbf\" (UniqueName: \"kubernetes.io/projected/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-kube-api-access-4tvbf\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.096915 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.110743 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.140774 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-sb\") pod \"07dae957-12b7-4d3c-8132-225c52efdbe0\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 
11:30:08.140877 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-nb\") pod \"07dae957-12b7-4d3c-8132-225c52efdbe0\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141193 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-svc\") pod \"07dae957-12b7-4d3c-8132-225c52efdbe0\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141225 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrjc4\" (UniqueName: \"kubernetes.io/projected/07dae957-12b7-4d3c-8132-225c52efdbe0-kube-api-access-lrjc4\") pod \"07dae957-12b7-4d3c-8132-225c52efdbe0\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141250 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-swift-storage-0\") pod \"07dae957-12b7-4d3c-8132-225c52efdbe0\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141289 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-config\") pod \"07dae957-12b7-4d3c-8132-225c52efdbe0\" (UID: \"07dae957-12b7-4d3c-8132-225c52efdbe0\") " Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141670 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141723 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141740 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkcbk\" (UniqueName: \"kubernetes.io/projected/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-kube-api-access-rkcbk\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141794 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141814 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141838 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141866 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.141903 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.156167 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07dae957-12b7-4d3c-8132-225c52efdbe0-kube-api-access-lrjc4" (OuterVolumeSpecName: "kube-api-access-lrjc4") pod "07dae957-12b7-4d3c-8132-225c52efdbe0" (UID: "07dae957-12b7-4d3c-8132-225c52efdbe0"). InnerVolumeSpecName "kube-api-access-lrjc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.192007 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07dae957-12b7-4d3c-8132-225c52efdbe0" (UID: "07dae957-12b7-4d3c-8132-225c52efdbe0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.211814 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07dae957-12b7-4d3c-8132-225c52efdbe0" (UID: "07dae957-12b7-4d3c-8132-225c52efdbe0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.224462 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-config" (OuterVolumeSpecName: "config") pod "07dae957-12b7-4d3c-8132-225c52efdbe0" (UID: "07dae957-12b7-4d3c-8132-225c52efdbe0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.236320 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07dae957-12b7-4d3c-8132-225c52efdbe0" (UID: "07dae957-12b7-4d3c-8132-225c52efdbe0"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.237110 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07dae957-12b7-4d3c-8132-225c52efdbe0" (UID: "07dae957-12b7-4d3c-8132-225c52efdbe0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.243334 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.244149 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.244234 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkcbk\" (UniqueName: \"kubernetes.io/projected/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-kube-api-access-rkcbk\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.244327 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.244429 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.244535 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.244643 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245180 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " 
pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245715 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245802 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrjc4\" (UniqueName: \"kubernetes.io/projected/07dae957-12b7-4d3c-8132-225c52efdbe0-kube-api-access-lrjc4\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245890 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245970 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.246032 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.246084 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07dae957-12b7-4d3c-8132-225c52efdbe0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245596 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245093 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.245515 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.254922 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.255585 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" 
Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.255704 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.267106 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkcbk\" (UniqueName: \"kubernetes.io/projected/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-kube-api-access-rkcbk\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.272886 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-j8b9x"] Oct 02 11:30:08 crc kubenswrapper[4929]: W1002 11:30:08.278578 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd73a449d_0b0f_40a9_9cc7_5e44447b2c86.slice/crio-989bdb5c536b88edf9b384461607ac3a2e42f9f9b7a3f125b58e18c0e34c7108 WatchSource:0}: Error finding container 989bdb5c536b88edf9b384461607ac3a2e42f9f9b7a3f125b58e18c0e34c7108: Status 404 returned error can't find the container with id 989bdb5c536b88edf9b384461607ac3a2e42f9f9b7a3f125b58e18c0e34c7108 Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.281393 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.294721 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.306172 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mh9sd"] Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.322520 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.570794 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.595713 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hqscp" event={"ID":"1419c407-f3ab-4214-8d8a-fc76d15b322e","Type":"ContainerStarted","Data":"95f2131a5f9718323b706cfe761be1d34f7ebf9f5b078a46bf5ca95158bec8a3"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.600458 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j8b9x" event={"ID":"d73a449d-0b0f-40a9-9cc7-5e44447b2c86","Type":"ContainerStarted","Data":"989bdb5c536b88edf9b384461607ac3a2e42f9f9b7a3f125b58e18c0e34c7108"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.607380 4929 generic.go:334] "Generic (PLEG): container finished" podID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerID="6f688d9ddf62796d832ca0848dbf1c98b55fce41f99c8e828191b4d34de8fa8d" exitCode=0 Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.607476 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" event={"ID":"471329a2-ca2f-4ba2-b750-32f88de79c8f","Type":"ContainerDied","Data":"6f688d9ddf62796d832ca0848dbf1c98b55fce41f99c8e828191b4d34de8fa8d"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.607503 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" event={"ID":"471329a2-ca2f-4ba2-b750-32f88de79c8f","Type":"ContainerStarted","Data":"1c844c4fb650f96541a30efba52b4f503a21085a0c0b23e8e559591d823b5b02"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.618535 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hqscp" podStartSLOduration=2.618514393 podStartE2EDuration="2.618514393s" podCreationTimestamp="2025-10-02 11:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:08.614036503 +0000 UTC m=+1209.164402877" watchObservedRunningTime="2025-10-02 11:30:08.618514393 +0000 UTC m=+1209.168880757" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.625395 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" event={"ID":"07dae957-12b7-4d3c-8132-225c52efdbe0","Type":"ContainerDied","Data":"9ae7e7fdf12ac8ae8e53c948e70104037ba22f8f6e8fcb2a8a88b04c58b7c166"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.625584 4929 scope.go:117] "RemoveContainer" containerID="ada30349953795951085122baf66aaa69533e72a60aebeccf1ba488cc4a936ee" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.625760 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwd5b" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.650064 4929 generic.go:334] "Generic (PLEG): container finished" podID="df66b312-cfee-4f1b-b547-d0e334ea6191" containerID="a70ec54b140e9f051d6771c05fe9aa7ff68a9c775503aa89f8bdccfda966e6d9" exitCode=0 Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.651143 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" event={"ID":"df66b312-cfee-4f1b-b547-d0e334ea6191","Type":"ContainerDied","Data":"a70ec54b140e9f051d6771c05fe9aa7ff68a9c775503aa89f8bdccfda966e6d9"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.651741 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" event={"ID":"df66b312-cfee-4f1b-b547-d0e334ea6191","Type":"ContainerStarted","Data":"d440c4f6addf02211d79473d811388e9f98496dab5f19a05b41b448cffe25e9f"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.665208 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerStarted","Data":"bfab6c0655342def95d494c6d46a7dfedf508f550ec4fd68eb90927ff2593dbb"} Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.668825 4929 scope.go:117] "RemoveContainer" containerID="ef49351a5706ea2bd9bb79ed3eb3536f18d388bdf3c71250f0ddd4a7feb26017" Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.674179 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwd5b"] Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.684358 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwd5b"] Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.838653 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:08 crc kubenswrapper[4929]: I1002 11:30:08.956197 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.037181 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.085116 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.179127 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0fa5-account-create-tnbhh"] Oct 02 11:30:09 crc kubenswrapper[4929]: E1002 11:30:09.179696 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerName="dnsmasq-dns" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.179707 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerName="dnsmasq-dns" Oct 02 11:30:09 crc kubenswrapper[4929]: E1002 11:30:09.179717 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerName="init" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.179723 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerName="init" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.179890 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="07dae957-12b7-4d3c-8132-225c52efdbe0" containerName="dnsmasq-dns" Oct 02 11:30:09 crc 
kubenswrapper[4929]: I1002 11:30:09.191795 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0fa5-account-create-tnbhh" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.195238 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.204682 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0fa5-account-create-tnbhh"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.252333 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:09 crc kubenswrapper[4929]: W1002 11:30:09.272553 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b993d0d_51d0_4ead_82d1_ef7a7f22a65a.slice/crio-bc9e4be6d1bebab791c91c5b78aabc7aac6e600d4a137d1d59fe8f35318ecef2 WatchSource:0}: Error finding container bc9e4be6d1bebab791c91c5b78aabc7aac6e600d4a137d1d59fe8f35318ecef2: Status 404 returned error can't find the container with id bc9e4be6d1bebab791c91c5b78aabc7aac6e600d4a137d1d59fe8f35318ecef2 Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.274393 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzjrm\" (UniqueName: \"kubernetes.io/projected/444e3b63-3bc5-4324-8431-53991c7c7256-kube-api-access-lzjrm\") pod \"cinder-0fa5-account-create-tnbhh\" (UID: \"444e3b63-3bc5-4324-8431-53991c7c7256\") " pod="openstack/cinder-0fa5-account-create-tnbhh" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.278768 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.375849 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-nb\") pod \"df66b312-cfee-4f1b-b547-d0e334ea6191\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.375915 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-swift-storage-0\") pod \"df66b312-cfee-4f1b-b547-d0e334ea6191\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.375948 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-sb\") pod \"df66b312-cfee-4f1b-b547-d0e334ea6191\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.375991 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lb78\" (UniqueName: \"kubernetes.io/projected/df66b312-cfee-4f1b-b547-d0e334ea6191-kube-api-access-4lb78\") pod \"df66b312-cfee-4f1b-b547-d0e334ea6191\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.376019 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-config\") pod \"df66b312-cfee-4f1b-b547-d0e334ea6191\" 
(UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.376042 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-svc\") pod \"df66b312-cfee-4f1b-b547-d0e334ea6191\" (UID: \"df66b312-cfee-4f1b-b547-d0e334ea6191\") " Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.376309 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzjrm\" (UniqueName: \"kubernetes.io/projected/444e3b63-3bc5-4324-8431-53991c7c7256-kube-api-access-lzjrm\") pod \"cinder-0fa5-account-create-tnbhh\" (UID: \"444e3b63-3bc5-4324-8431-53991c7c7256\") " pod="openstack/cinder-0fa5-account-create-tnbhh" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.377600 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-9b10-account-create-672ps"] Oct 02 11:30:09 crc kubenswrapper[4929]: E1002 11:30:09.378050 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df66b312-cfee-4f1b-b547-d0e334ea6191" containerName="init" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.378064 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="df66b312-cfee-4f1b-b547-d0e334ea6191" containerName="init" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.378275 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="df66b312-cfee-4f1b-b547-d0e334ea6191" containerName="init" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.378795 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9b10-account-create-672ps" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.381788 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.384853 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df66b312-cfee-4f1b-b547-d0e334ea6191-kube-api-access-4lb78" (OuterVolumeSpecName: "kube-api-access-4lb78") pod "df66b312-cfee-4f1b-b547-d0e334ea6191" (UID: "df66b312-cfee-4f1b-b547-d0e334ea6191"). InnerVolumeSpecName "kube-api-access-4lb78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.400806 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzjrm\" (UniqueName: \"kubernetes.io/projected/444e3b63-3bc5-4324-8431-53991c7c7256-kube-api-access-lzjrm\") pod \"cinder-0fa5-account-create-tnbhh\" (UID: \"444e3b63-3bc5-4324-8431-53991c7c7256\") " pod="openstack/cinder-0fa5-account-create-tnbhh" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.402246 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9b10-account-create-672ps"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.404791 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df66b312-cfee-4f1b-b547-d0e334ea6191" (UID: "df66b312-cfee-4f1b-b547-d0e334ea6191"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.420277 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df66b312-cfee-4f1b-b547-d0e334ea6191" (UID: "df66b312-cfee-4f1b-b547-d0e334ea6191"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.420336 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df66b312-cfee-4f1b-b547-d0e334ea6191" (UID: "df66b312-cfee-4f1b-b547-d0e334ea6191"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.420376 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-config" (OuterVolumeSpecName: "config") pod "df66b312-cfee-4f1b-b547-d0e334ea6191" (UID: "df66b312-cfee-4f1b-b547-d0e334ea6191"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.422738 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df66b312-cfee-4f1b-b547-d0e334ea6191" (UID: "df66b312-cfee-4f1b-b547-d0e334ea6191"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.477358 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xjjb\" (UniqueName: \"kubernetes.io/projected/d193edfe-fe8e-43fb-a328-7018cd7ab38e-kube-api-access-5xjjb\") pod \"barbican-9b10-account-create-672ps\" (UID: \"d193edfe-fe8e-43fb-a328-7018cd7ab38e\") " pod="openstack/barbican-9b10-account-create-672ps" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.477467 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.477479 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.477488 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.477496 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lb78\" (UniqueName: \"kubernetes.io/projected/df66b312-cfee-4f1b-b547-d0e334ea6191-kube-api-access-4lb78\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.477509 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.477516 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df66b312-cfee-4f1b-b547-d0e334ea6191-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.578222 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b2b-account-create-pjqbx"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.579055 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xjjb\" (UniqueName: \"kubernetes.io/projected/d193edfe-fe8e-43fb-a328-7018cd7ab38e-kube-api-access-5xjjb\") pod \"barbican-9b10-account-create-672ps\" (UID: \"d193edfe-fe8e-43fb-a328-7018cd7ab38e\") " pod="openstack/barbican-9b10-account-create-672ps" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.579381 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b2b-account-create-pjqbx" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.581849 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.593120 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b2b-account-create-pjqbx"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.620493 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xjjb\" (UniqueName: \"kubernetes.io/projected/d193edfe-fe8e-43fb-a328-7018cd7ab38e-kube-api-access-5xjjb\") pod \"barbican-9b10-account-create-672ps\" (UID: \"d193edfe-fe8e-43fb-a328-7018cd7ab38e\") " pod="openstack/barbican-9b10-account-create-672ps" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.624094 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0fa5-account-create-tnbhh" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.680365 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrc5d\" (UniqueName: \"kubernetes.io/projected/f31e4bd6-d3e3-452e-ba3b-40892b9866e3-kube-api-access-wrc5d\") pod \"neutron-7b2b-account-create-pjqbx\" (UID: \"f31e4bd6-d3e3-452e-ba3b-40892b9866e3\") " pod="openstack/neutron-7b2b-account-create-pjqbx" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.682065 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" event={"ID":"df66b312-cfee-4f1b-b547-d0e334ea6191","Type":"ContainerDied","Data":"d440c4f6addf02211d79473d811388e9f98496dab5f19a05b41b448cffe25e9f"} Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.682103 4929 scope.go:117] "RemoveContainer" containerID="a70ec54b140e9f051d6771c05fe9aa7ff68a9c775503aa89f8bdccfda966e6d9" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.682203 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-7gp7d" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.689135 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" event={"ID":"471329a2-ca2f-4ba2-b750-32f88de79c8f","Type":"ContainerStarted","Data":"9613e61314e8d8663e2f8f6c661a9cc50f93cc5a9ce125b53a1bb990f3441f17"} Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.689215 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.693033 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a","Type":"ContainerStarted","Data":"bc9e4be6d1bebab791c91c5b78aabc7aac6e600d4a137d1d59fe8f35318ecef2"} Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.695658 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc4a9fe-8ea6-4630-a122-01aad42acf5f","Type":"ContainerStarted","Data":"1ce445f72a499a66f5cafe7918d6b74106dabfde8841728aeb0d9cf504fb4cb3"} Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.695684 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc4a9fe-8ea6-4630-a122-01aad42acf5f","Type":"ContainerStarted","Data":"f0e315de6f1197909277336e3ad729565e95b44cd1b4a690e738f5f965533055"} Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.704937 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9b10-account-create-672ps" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.716762 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" podStartSLOduration=2.716728008 podStartE2EDuration="2.716728008s" podCreationTimestamp="2025-10-02 11:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:09.709017581 +0000 UTC m=+1210.259383955" watchObservedRunningTime="2025-10-02 11:30:09.716728008 +0000 UTC m=+1210.267094372" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.758187 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-7gp7d"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.769966 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-7gp7d"] Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.781540 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrc5d\" (UniqueName: \"kubernetes.io/projected/f31e4bd6-d3e3-452e-ba3b-40892b9866e3-kube-api-access-wrc5d\") pod \"neutron-7b2b-account-create-pjqbx\" (UID: \"f31e4bd6-d3e3-452e-ba3b-40892b9866e3\") " pod="openstack/neutron-7b2b-account-create-pjqbx" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.801295 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrc5d\" (UniqueName: \"kubernetes.io/projected/f31e4bd6-d3e3-452e-ba3b-40892b9866e3-kube-api-access-wrc5d\") pod \"neutron-7b2b-account-create-pjqbx\" (UID: \"f31e4bd6-d3e3-452e-ba3b-40892b9866e3\") " pod="openstack/neutron-7b2b-account-create-pjqbx" Oct 02 11:30:09 crc kubenswrapper[4929]: I1002 11:30:09.932283 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b2b-account-create-pjqbx" Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.105985 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0fa5-account-create-tnbhh"] Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.179839 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07dae957-12b7-4d3c-8132-225c52efdbe0" path="/var/lib/kubelet/pods/07dae957-12b7-4d3c-8132-225c52efdbe0/volumes" Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.180713 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df66b312-cfee-4f1b-b547-d0e334ea6191" path="/var/lib/kubelet/pods/df66b312-cfee-4f1b-b547-d0e334ea6191/volumes" Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.297949 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-9b10-account-create-672ps"] Oct 02 11:30:10 crc kubenswrapper[4929]: W1002 11:30:10.312276 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd193edfe_fe8e_43fb_a328_7018cd7ab38e.slice/crio-805cea5dc7d3e2c3cda5265f49f6d280c1559c67915ef682cae84bf1ee417e0f WatchSource:0}: Error finding container 805cea5dc7d3e2c3cda5265f49f6d280c1559c67915ef682cae84bf1ee417e0f: Status 404 returned error can't find the container with id 805cea5dc7d3e2c3cda5265f49f6d280c1559c67915ef682cae84bf1ee417e0f Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.450265 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b2b-account-create-pjqbx"] Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.708169 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a","Type":"ContainerStarted","Data":"353d9fdda9672c972c70396fa7e1a0243c8d9d81256a02028bbd562ca2f71d06"} Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.713552 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc4a9fe-8ea6-4630-a122-01aad42acf5f","Type":"ContainerStarted","Data":"af4bae6289883288ef060fcf7511b383a414720f01848ab83c41c01fa5e5aa01"} Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.713638 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-log" containerID="cri-o://1ce445f72a499a66f5cafe7918d6b74106dabfde8841728aeb0d9cf504fb4cb3" gracePeriod=30 Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.713696 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-httpd" containerID="cri-o://af4bae6289883288ef060fcf7511b383a414720f01848ab83c41c01fa5e5aa01" gracePeriod=30 Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.718387 4929 generic.go:334] "Generic (PLEG): container finished" podID="444e3b63-3bc5-4324-8431-53991c7c7256" containerID="a54ad1e68998d5eeaffecd2ceb60e4ce78bfa102c02171343d152bc7337ef342" exitCode=0 Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.718461 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0fa5-account-create-tnbhh" event={"ID":"444e3b63-3bc5-4324-8431-53991c7c7256","Type":"ContainerDied","Data":"a54ad1e68998d5eeaffecd2ceb60e4ce78bfa102c02171343d152bc7337ef342"} Oct 02 
11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.718508 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0fa5-account-create-tnbhh" event={"ID":"444e3b63-3bc5-4324-8431-53991c7c7256","Type":"ContainerStarted","Data":"94393cc14624b58cce4a53c25948f7f3c33988765f8fdbe80118d05bdbd998c5"} Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.721269 4929 generic.go:334] "Generic (PLEG): container finished" podID="d193edfe-fe8e-43fb-a328-7018cd7ab38e" containerID="9dee8e121405f59c5f13c985beb42dd248f4368f1e1001f5114864300ae94b3d" exitCode=0 Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.721447 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9b10-account-create-672ps" event={"ID":"d193edfe-fe8e-43fb-a328-7018cd7ab38e","Type":"ContainerDied","Data":"9dee8e121405f59c5f13c985beb42dd248f4368f1e1001f5114864300ae94b3d"} Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.721474 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9b10-account-create-672ps" event={"ID":"d193edfe-fe8e-43fb-a328-7018cd7ab38e","Type":"ContainerStarted","Data":"805cea5dc7d3e2c3cda5265f49f6d280c1559c67915ef682cae84bf1ee417e0f"} Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.725790 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b2b-account-create-pjqbx" event={"ID":"f31e4bd6-d3e3-452e-ba3b-40892b9866e3","Type":"ContainerStarted","Data":"c87639312612f5b7e09b8aebb0316b6075f81445245717e51fda5751a3db56c8"} Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.725915 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b2b-account-create-pjqbx" event={"ID":"f31e4bd6-d3e3-452e-ba3b-40892b9866e3","Type":"ContainerStarted","Data":"c00a6ff8d4f94536c03af0ab42b3409361ce34ddb1d673c84b1ff944a762c89b"} Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.757108 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.757086067 podStartE2EDuration="4.757086067s" podCreationTimestamp="2025-10-02 11:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:10.74045932 +0000 UTC m=+1211.290825694" watchObservedRunningTime="2025-10-02 11:30:10.757086067 +0000 UTC m=+1211.307452431" Oct 02 11:30:10 crc kubenswrapper[4929]: I1002 11:30:10.783428 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b2b-account-create-pjqbx" podStartSLOduration=1.7834112659999999 podStartE2EDuration="1.783411266s" podCreationTimestamp="2025-10-02 11:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:10.775635066 +0000 UTC m=+1211.326001420" watchObservedRunningTime="2025-10-02 11:30:10.783411266 +0000 UTC m=+1211.333777630" Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.736405 4929 generic.go:334] "Generic (PLEG): container finished" podID="1419c407-f3ab-4214-8d8a-fc76d15b322e" containerID="95f2131a5f9718323b706cfe761be1d34f7ebf9f5b078a46bf5ca95158bec8a3" exitCode=0 Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.736496 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hqscp" 
event={"ID":"1419c407-f3ab-4214-8d8a-fc76d15b322e","Type":"ContainerDied","Data":"95f2131a5f9718323b706cfe761be1d34f7ebf9f5b078a46bf5ca95158bec8a3"} Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.738553 4929 generic.go:334] "Generic (PLEG): container finished" podID="f31e4bd6-d3e3-452e-ba3b-40892b9866e3" containerID="c87639312612f5b7e09b8aebb0316b6075f81445245717e51fda5751a3db56c8" exitCode=0 Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.738648 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b2b-account-create-pjqbx" event={"ID":"f31e4bd6-d3e3-452e-ba3b-40892b9866e3","Type":"ContainerDied","Data":"c87639312612f5b7e09b8aebb0316b6075f81445245717e51fda5751a3db56c8"} Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.742657 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a","Type":"ContainerStarted","Data":"1e1e65801c95faacc5b36b41bc4d34fd8d908ef400a227beb6b3e838a8c71216"} Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.742755 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-log" containerID="cri-o://353d9fdda9672c972c70396fa7e1a0243c8d9d81256a02028bbd562ca2f71d06" gracePeriod=30 Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.742782 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-httpd" containerID="cri-o://1e1e65801c95faacc5b36b41bc4d34fd8d908ef400a227beb6b3e838a8c71216" gracePeriod=30 Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.746376 4929 generic.go:334] "Generic (PLEG): container finished" podID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerID="af4bae6289883288ef060fcf7511b383a414720f01848ab83c41c01fa5e5aa01" exitCode=143 Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.746401 4929 generic.go:334] "Generic (PLEG): container finished" podID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerID="1ce445f72a499a66f5cafe7918d6b74106dabfde8841728aeb0d9cf504fb4cb3" exitCode=143 Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.746485 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc4a9fe-8ea6-4630-a122-01aad42acf5f","Type":"ContainerDied","Data":"af4bae6289883288ef060fcf7511b383a414720f01848ab83c41c01fa5e5aa01"} Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.746544 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc4a9fe-8ea6-4630-a122-01aad42acf5f","Type":"ContainerDied","Data":"1ce445f72a499a66f5cafe7918d6b74106dabfde8841728aeb0d9cf504fb4cb3"} Oct 02 11:30:11 crc kubenswrapper[4929]: I1002 11:30:11.781343 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.781299192 podStartE2EDuration="5.781299192s" podCreationTimestamp="2025-10-02 11:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:11.774690794 +0000 UTC m=+1212.325057168" watchObservedRunningTime="2025-10-02 11:30:11.781299192 +0000 UTC m=+1212.331665566" Oct 02 11:30:12 crc kubenswrapper[4929]: I1002 11:30:12.757122 4929 generic.go:334] 
"Generic (PLEG): container finished" podID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerID="1e1e65801c95faacc5b36b41bc4d34fd8d908ef400a227beb6b3e838a8c71216" exitCode=0 Oct 02 11:30:12 crc kubenswrapper[4929]: I1002 11:30:12.757165 4929 generic.go:334] "Generic (PLEG): container finished" podID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerID="353d9fdda9672c972c70396fa7e1a0243c8d9d81256a02028bbd562ca2f71d06" exitCode=143 Oct 02 11:30:12 crc kubenswrapper[4929]: I1002 11:30:12.757244 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a","Type":"ContainerDied","Data":"1e1e65801c95faacc5b36b41bc4d34fd8d908ef400a227beb6b3e838a8c71216"} Oct 02 11:30:12 crc kubenswrapper[4929]: I1002 11:30:12.758031 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a","Type":"ContainerDied","Data":"353d9fdda9672c972c70396fa7e1a0243c8d9d81256a02028bbd562ca2f71d06"} Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.488358 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.606879 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-logs\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.606932 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-scripts\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.606987 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-public-tls-certs\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.607010 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tvbf\" (UniqueName: \"kubernetes.io/projected/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-kube-api-access-4tvbf\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.607051 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-httpd-run\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.607092 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-config-data\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.607124 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-combined-ca-bundle\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.607173 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\" (UID: \"9dc4a9fe-8ea6-4630-a122-01aad42acf5f\") " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.607432 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-logs" (OuterVolumeSpecName: "logs") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.607650 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.613324 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-kube-api-access-4tvbf" (OuterVolumeSpecName: "kube-api-access-4tvbf") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "kube-api-access-4tvbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.614588 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.616180 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-scripts" (OuterVolumeSpecName: "scripts") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.639663 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.668217 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-config-data" (OuterVolumeSpecName: "config-data") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.684282 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9dc4a9fe-8ea6-4630-a122-01aad42acf5f" (UID: "9dc4a9fe-8ea6-4630-a122-01aad42acf5f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.713630 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.713870 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.713921 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.713945 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.713977 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tvbf\" (UniqueName: \"kubernetes.io/projected/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-kube-api-access-4tvbf\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.713991 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.714006 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.714019 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dc4a9fe-8ea6-4630-a122-01aad42acf5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.729998 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.775126 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9dc4a9fe-8ea6-4630-a122-01aad42acf5f","Type":"ContainerDied","Data":"f0e315de6f1197909277336e3ad729565e95b44cd1b4a690e738f5f965533055"} Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.775210 4929 scope.go:117] "RemoveContainer" containerID="af4bae6289883288ef060fcf7511b383a414720f01848ab83c41c01fa5e5aa01" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.775210 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.813026 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.825365 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.837148 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.844098 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:13 crc kubenswrapper[4929]: E1002 11:30:13.844594 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-httpd" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.844617 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-httpd" Oct 02 11:30:13 crc kubenswrapper[4929]: E1002 11:30:13.844650 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-log" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.844657 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-log" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.844839 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-httpd" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.844857 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" containerName="glance-log" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.845815 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.849276 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.849425 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:30:13 crc kubenswrapper[4929]: I1002 11:30:13.851246 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031383 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p468p\" (UniqueName: \"kubernetes.io/projected/31f4d876-4669-4a1f-b3c4-90f39131f726-kube-api-access-p468p\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031438 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031466 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031490 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031508 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031548 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-config-data\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031565 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-scripts\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.031600 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-logs\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133274 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133326 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133378 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-config-data\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133395 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-scripts\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133431 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-logs\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133497 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p468p\" (UniqueName: \"kubernetes.io/projected/31f4d876-4669-4a1f-b3c4-90f39131f726-kube-api-access-p468p\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133524 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133562 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.133608 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.134219 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.134240 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-logs\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.138300 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.139165 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-config-data\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.139203 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-scripts\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.144781 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.152162 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p468p\" (UniqueName: \"kubernetes.io/projected/31f4d876-4669-4a1f-b3c4-90f39131f726-kube-api-access-p468p\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.163992 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.169676 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc4a9fe-8ea6-4630-a122-01aad42acf5f" path="/var/lib/kubelet/pods/9dc4a9fe-8ea6-4630-a122-01aad42acf5f/volumes" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.195149 4929 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.737183 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:30:14 crc kubenswrapper[4929]: I1002 11:30:14.737250 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.416808 4929 scope.go:117] "RemoveContainer" containerID="1ce445f72a499a66f5cafe7918d6b74106dabfde8841728aeb0d9cf504fb4cb3" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.520523 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0fa5-account-create-tnbhh" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.524837 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-9b10-account-create-672ps" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.577710 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b2b-account-create-pjqbx" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.663644 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.683149 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrc5d\" (UniqueName: \"kubernetes.io/projected/f31e4bd6-d3e3-452e-ba3b-40892b9866e3-kube-api-access-wrc5d\") pod \"f31e4bd6-d3e3-452e-ba3b-40892b9866e3\" (UID: \"f31e4bd6-d3e3-452e-ba3b-40892b9866e3\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.683234 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzjrm\" (UniqueName: \"kubernetes.io/projected/444e3b63-3bc5-4324-8431-53991c7c7256-kube-api-access-lzjrm\") pod \"444e3b63-3bc5-4324-8431-53991c7c7256\" (UID: \"444e3b63-3bc5-4324-8431-53991c7c7256\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.683278 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xjjb\" (UniqueName: \"kubernetes.io/projected/d193edfe-fe8e-43fb-a328-7018cd7ab38e-kube-api-access-5xjjb\") pod \"d193edfe-fe8e-43fb-a328-7018cd7ab38e\" (UID: \"d193edfe-fe8e-43fb-a328-7018cd7ab38e\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.699304 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31e4bd6-d3e3-452e-ba3b-40892b9866e3-kube-api-access-wrc5d" (OuterVolumeSpecName: "kube-api-access-wrc5d") pod "f31e4bd6-d3e3-452e-ba3b-40892b9866e3" (UID: "f31e4bd6-d3e3-452e-ba3b-40892b9866e3"). InnerVolumeSpecName "kube-api-access-wrc5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.700202 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444e3b63-3bc5-4324-8431-53991c7c7256-kube-api-access-lzjrm" (OuterVolumeSpecName: "kube-api-access-lzjrm") pod "444e3b63-3bc5-4324-8431-53991c7c7256" (UID: "444e3b63-3bc5-4324-8431-53991c7c7256"). InnerVolumeSpecName "kube-api-access-lzjrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.700308 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d193edfe-fe8e-43fb-a328-7018cd7ab38e-kube-api-access-5xjjb" (OuterVolumeSpecName: "kube-api-access-5xjjb") pod "d193edfe-fe8e-43fb-a328-7018cd7ab38e" (UID: "d193edfe-fe8e-43fb-a328-7018cd7ab38e"). InnerVolumeSpecName "kube-api-access-5xjjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.784436 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-credential-keys\") pod \"1419c407-f3ab-4214-8d8a-fc76d15b322e\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.784558 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-scripts\") pod \"1419c407-f3ab-4214-8d8a-fc76d15b322e\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.784611 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-combined-ca-bundle\") pod \"1419c407-f3ab-4214-8d8a-fc76d15b322e\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.784634 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdvgr\" (UniqueName: \"kubernetes.io/projected/1419c407-f3ab-4214-8d8a-fc76d15b322e-kube-api-access-vdvgr\") pod \"1419c407-f3ab-4214-8d8a-fc76d15b322e\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.785037 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-config-data\") pod \"1419c407-f3ab-4214-8d8a-fc76d15b322e\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.785119 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-fernet-keys\") pod \"1419c407-f3ab-4214-8d8a-fc76d15b322e\" (UID: \"1419c407-f3ab-4214-8d8a-fc76d15b322e\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.785575 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzjrm\" (UniqueName: \"kubernetes.io/projected/444e3b63-3bc5-4324-8431-53991c7c7256-kube-api-access-lzjrm\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.785600 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xjjb\" (UniqueName: 
\"kubernetes.io/projected/d193edfe-fe8e-43fb-a328-7018cd7ab38e-kube-api-access-5xjjb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.785615 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrc5d\" (UniqueName: \"kubernetes.io/projected/f31e4bd6-d3e3-452e-ba3b-40892b9866e3-kube-api-access-wrc5d\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.788421 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1419c407-f3ab-4214-8d8a-fc76d15b322e-kube-api-access-vdvgr" (OuterVolumeSpecName: "kube-api-access-vdvgr") pod "1419c407-f3ab-4214-8d8a-fc76d15b322e" (UID: "1419c407-f3ab-4214-8d8a-fc76d15b322e"). InnerVolumeSpecName "kube-api-access-vdvgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.788691 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1419c407-f3ab-4214-8d8a-fc76d15b322e" (UID: "1419c407-f3ab-4214-8d8a-fc76d15b322e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.789296 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-scripts" (OuterVolumeSpecName: "scripts") pod "1419c407-f3ab-4214-8d8a-fc76d15b322e" (UID: "1419c407-f3ab-4214-8d8a-fc76d15b322e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.789917 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1419c407-f3ab-4214-8d8a-fc76d15b322e" (UID: "1419c407-f3ab-4214-8d8a-fc76d15b322e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.809686 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0fa5-account-create-tnbhh" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.809662 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0fa5-account-create-tnbhh" event={"ID":"444e3b63-3bc5-4324-8431-53991c7c7256","Type":"ContainerDied","Data":"94393cc14624b58cce4a53c25948f7f3c33988765f8fdbe80118d05bdbd998c5"} Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.809770 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94393cc14624b58cce4a53c25948f7f3c33988765f8fdbe80118d05bdbd998c5" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.811642 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-9b10-account-create-672ps" event={"ID":"d193edfe-fe8e-43fb-a328-7018cd7ab38e","Type":"ContainerDied","Data":"805cea5dc7d3e2c3cda5265f49f6d280c1559c67915ef682cae84bf1ee417e0f"} Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.811678 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="805cea5dc7d3e2c3cda5265f49f6d280c1559c67915ef682cae84bf1ee417e0f" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.811659 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-9b10-account-create-672ps" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.818346 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerStarted","Data":"e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c"} Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.820053 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hqscp" event={"ID":"1419c407-f3ab-4214-8d8a-fc76d15b322e","Type":"ContainerDied","Data":"79d1a901f343f31fb92a6ec52d277e1cdba48ded27a1571bec560888f491dd85"} Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.820076 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79d1a901f343f31fb92a6ec52d277e1cdba48ded27a1571bec560888f491dd85" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.820128 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hqscp" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.822393 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1419c407-f3ab-4214-8d8a-fc76d15b322e" (UID: "1419c407-f3ab-4214-8d8a-fc76d15b322e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.831258 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j8b9x" event={"ID":"d73a449d-0b0f-40a9-9cc7-5e44447b2c86","Type":"ContainerStarted","Data":"4c2c63d65925d087720cb093da6f11e1c0fd2badb401568b1904d09e45b554c8"} Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.832418 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-config-data" (OuterVolumeSpecName: "config-data") pod "1419c407-f3ab-4214-8d8a-fc76d15b322e" (UID: "1419c407-f3ab-4214-8d8a-fc76d15b322e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.833107 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b2b-account-create-pjqbx" event={"ID":"f31e4bd6-d3e3-452e-ba3b-40892b9866e3","Type":"ContainerDied","Data":"c00a6ff8d4f94536c03af0ab42b3409361ce34ddb1d673c84b1ff944a762c89b"} Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.833130 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00a6ff8d4f94536c03af0ab42b3409361ce34ddb1d673c84b1ff944a762c89b" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.833169 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b2b-account-create-pjqbx" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.849407 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-j8b9x" podStartSLOduration=1.704872166 podStartE2EDuration="9.84938443s" podCreationTimestamp="2025-10-02 11:30:07 +0000 UTC" firstStartedPulling="2025-10-02 11:30:08.283758637 +0000 UTC m=+1208.834124991" lastFinishedPulling="2025-10-02 11:30:16.428270871 +0000 UTC m=+1216.978637255" observedRunningTime="2025-10-02 11:30:16.84676146 +0000 UTC m=+1217.397127824" watchObservedRunningTime="2025-10-02 11:30:16.84938443 +0000 UTC m=+1217.399750794" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.863307 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.889933 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.889995 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.890010 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdvgr\" (UniqueName: \"kubernetes.io/projected/1419c407-f3ab-4214-8d8a-fc76d15b322e-kube-api-access-vdvgr\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.890023 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.890038 4929 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.890050 4929 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1419c407-f3ab-4214-8d8a-fc76d15b322e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.991381 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-config-data\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.991460 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-internal-tls-certs\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.991488 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-combined-ca-bundle\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 
crc kubenswrapper[4929]: I1002 11:30:16.991551 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.991618 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-httpd-run\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.991674 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkcbk\" (UniqueName: \"kubernetes.io/projected/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-kube-api-access-rkcbk\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.991724 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-scripts\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.991740 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-logs\") pod \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\" (UID: \"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a\") " Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.992285 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.992328 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-logs" (OuterVolumeSpecName: "logs") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.995228 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.996177 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-kube-api-access-rkcbk" (OuterVolumeSpecName: "kube-api-access-rkcbk") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "kube-api-access-rkcbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:16 crc kubenswrapper[4929]: I1002 11:30:16.996975 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-scripts" (OuterVolumeSpecName: "scripts") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.010996 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:30:17 crc kubenswrapper[4929]: W1002 11:30:17.012517 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f4d876_4669_4a1f_b3c4_90f39131f726.slice/crio-f754ae7612b56e4db8f64a6d9adccc56f38f9b0c392233864c54196836efa928 WatchSource:0}: Error finding container f754ae7612b56e4db8f64a6d9adccc56f38f9b0c392233864c54196836efa928: Status 404 returned error can't find the container with id f754ae7612b56e4db8f64a6d9adccc56f38f9b0c392233864c54196836efa928 Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.028059 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.044169 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.046319 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-config-data" (OuterVolumeSpecName: "config-data") pod "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" (UID: "9b993d0d-51d0-4ead-82d1-ef7a7f22a65a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094434 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094468 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094479 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094508 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094519 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094529 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkcbk\" (UniqueName: \"kubernetes.io/projected/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-kube-api-access-rkcbk\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094538 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.094547 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.111867 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.196334 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.571273 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.632049 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v89pl"] Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.636149 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" podUID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerName="dnsmasq-dns" containerID="cri-o://f1c16475ebb3a9d3bb20a389c8c2b5741fb8a4caea85c6a0a5e5da18a1e2a31d" gracePeriod=10 Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.760419 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hqscp"] Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.776461 4929 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hqscp"] Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.862381 4929 generic.go:334] "Generic (PLEG): container finished" podID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerID="f1c16475ebb3a9d3bb20a389c8c2b5741fb8a4caea85c6a0a5e5da18a1e2a31d" exitCode=0 Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.862443 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" event={"ID":"37b19762-0589-4167-8a91-b0ccd22fefdf","Type":"ContainerDied","Data":"f1c16475ebb3a9d3bb20a389c8c2b5741fb8a4caea85c6a0a5e5da18a1e2a31d"} Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.862563 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6hhg9"] Oct 02 11:30:17 crc kubenswrapper[4929]: E1002 11:30:17.862972 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-log" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.862990 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-log" Oct 02 11:30:17 crc kubenswrapper[4929]: E1002 11:30:17.863008 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1419c407-f3ab-4214-8d8a-fc76d15b322e" containerName="keystone-bootstrap" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863016 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1419c407-f3ab-4214-8d8a-fc76d15b322e" containerName="keystone-bootstrap" Oct 02 11:30:17 crc kubenswrapper[4929]: E1002 11:30:17.863031 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-httpd" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863040 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-httpd" Oct 02 11:30:17 crc kubenswrapper[4929]: E1002 11:30:17.863056 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d193edfe-fe8e-43fb-a328-7018cd7ab38e" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863064 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d193edfe-fe8e-43fb-a328-7018cd7ab38e" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: E1002 11:30:17.863084 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444e3b63-3bc5-4324-8431-53991c7c7256" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863092 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="444e3b63-3bc5-4324-8431-53991c7c7256" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: E1002 11:30:17.863115 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31e4bd6-d3e3-452e-ba3b-40892b9866e3" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863122 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31e4bd6-d3e3-452e-ba3b-40892b9866e3" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863361 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-log" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863374 4929 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d193edfe-fe8e-43fb-a328-7018cd7ab38e" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863384 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" containerName="glance-httpd" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863396 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31e4bd6-d3e3-452e-ba3b-40892b9866e3" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863413 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1419c407-f3ab-4214-8d8a-fc76d15b322e" containerName="keystone-bootstrap" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.863426 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="444e3b63-3bc5-4324-8431-53991c7c7256" containerName="mariadb-account-create" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.864023 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.866027 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.866027 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.866382 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9qdpv" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.866389 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.867877 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b993d0d-51d0-4ead-82d1-ef7a7f22a65a","Type":"ContainerDied","Data":"bc9e4be6d1bebab791c91c5b78aabc7aac6e600d4a137d1d59fe8f35318ecef2"} Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.867923 4929 scope.go:117] "RemoveContainer" containerID="1e1e65801c95faacc5b36b41bc4d34fd8d908ef400a227beb6b3e838a8c71216" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.868153 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.875470 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31f4d876-4669-4a1f-b3c4-90f39131f726","Type":"ContainerStarted","Data":"33d493d22c989aac977ca56b30eaa4751bee9cf5bc3d357763c0d2fbfd6345c3"} Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.875521 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31f4d876-4669-4a1f-b3c4-90f39131f726","Type":"ContainerStarted","Data":"f754ae7612b56e4db8f64a6d9adccc56f38f9b0c392233864c54196836efa928"} Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.890268 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6hhg9"] Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.919264 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.927735 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.934517 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.937197 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.942301 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.942551 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:30:17 crc kubenswrapper[4929]: I1002 11:30:17.952114 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.023566 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-scripts\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.023608 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-fernet-keys\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.023869 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-credential-keys\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.023941 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlg6\" (UniqueName: \"kubernetes.io/projected/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-kube-api-access-cvlg6\") pod 
\"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.024044 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-combined-ca-bundle\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.024131 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-config-data\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.107274 4929 scope.go:117] "RemoveContainer" containerID="353d9fdda9672c972c70396fa7e1a0243c8d9d81256a02028bbd562ca2f71d06" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.126516 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-logs\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.126855 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-scripts\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.126893 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-fernet-keys\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.126975 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127007 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127060 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-credential-keys\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127085 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127113 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlg6\" (UniqueName: \"kubernetes.io/projected/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-kube-api-access-cvlg6\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127153 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127182 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgwfj\" (UniqueName: \"kubernetes.io/projected/870cd509-a882-46d4-ae9d-c89ee843385a-kube-api-access-tgwfj\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127212 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-combined-ca-bundle\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127249 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127274 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-config-data\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.127310 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.132160 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-scripts\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.132446 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-combined-ca-bundle\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.133703 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-credential-keys\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.133855 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-fernet-keys\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.135854 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-config-data\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.148894 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvlg6\" (UniqueName: \"kubernetes.io/projected/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-kube-api-access-cvlg6\") pod \"keystone-bootstrap-6hhg9\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.171553 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1419c407-f3ab-4214-8d8a-fc76d15b322e" path="/var/lib/kubelet/pods/1419c407-f3ab-4214-8d8a-fc76d15b322e/volumes" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.172365 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b993d0d-51d0-4ead-82d1-ef7a7f22a65a" path="/var/lib/kubelet/pods/9b993d0d-51d0-4ead-82d1-ef7a7f22a65a/volumes" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.197126 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228691 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-logs\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228770 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228789 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228822 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228848 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228869 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgwfj\" (UniqueName: \"kubernetes.io/projected/870cd509-a882-46d4-ae9d-c89ee843385a-kube-api-access-tgwfj\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228900 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.228925 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.231454 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: 
I1002 11:30:18.231676 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-logs\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.235602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.236419 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.237723 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.241065 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.241932 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.252067 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgwfj\" (UniqueName: \"kubernetes.io/projected/870cd509-a882-46d4-ae9d-c89ee843385a-kube-api-access-tgwfj\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.277744 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.531418 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.572593 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.638860 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-config\") pod \"37b19762-0589-4167-8a91-b0ccd22fefdf\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.638911 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-nb\") pod \"37b19762-0589-4167-8a91-b0ccd22fefdf\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.639061 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psplz\" (UniqueName: \"kubernetes.io/projected/37b19762-0589-4167-8a91-b0ccd22fefdf-kube-api-access-psplz\") pod \"37b19762-0589-4167-8a91-b0ccd22fefdf\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.639101 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-sb\") pod \"37b19762-0589-4167-8a91-b0ccd22fefdf\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.639128 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-dns-svc\") pod \"37b19762-0589-4167-8a91-b0ccd22fefdf\" (UID: \"37b19762-0589-4167-8a91-b0ccd22fefdf\") " Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.644237 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b19762-0589-4167-8a91-b0ccd22fefdf-kube-api-access-psplz" (OuterVolumeSpecName: "kube-api-access-psplz") pod "37b19762-0589-4167-8a91-b0ccd22fefdf" (UID: "37b19762-0589-4167-8a91-b0ccd22fefdf"). InnerVolumeSpecName "kube-api-access-psplz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.685974 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-config" (OuterVolumeSpecName: "config") pod "37b19762-0589-4167-8a91-b0ccd22fefdf" (UID: "37b19762-0589-4167-8a91-b0ccd22fefdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.706872 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6hhg9"] Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.707238 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37b19762-0589-4167-8a91-b0ccd22fefdf" (UID: "37b19762-0589-4167-8a91-b0ccd22fefdf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.709138 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37b19762-0589-4167-8a91-b0ccd22fefdf" (UID: "37b19762-0589-4167-8a91-b0ccd22fefdf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.714336 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37b19762-0589-4167-8a91-b0ccd22fefdf" (UID: "37b19762-0589-4167-8a91-b0ccd22fefdf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.741885 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.741918 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.741932 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psplz\" (UniqueName: \"kubernetes.io/projected/37b19762-0589-4167-8a91-b0ccd22fefdf-kube-api-access-psplz\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.741944 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.741955 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37b19762-0589-4167-8a91-b0ccd22fefdf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.890319 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" event={"ID":"37b19762-0589-4167-8a91-b0ccd22fefdf","Type":"ContainerDied","Data":"1263ae6459bc03d5e1eb29a78c1a8da170c4bea203f23764ef6436d116e7ec86"} Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.890351 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-v89pl" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.890366 4929 scope.go:117] "RemoveContainer" containerID="f1c16475ebb3a9d3bb20a389c8c2b5741fb8a4caea85c6a0a5e5da18a1e2a31d" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.895606 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6hhg9" event={"ID":"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605","Type":"ContainerStarted","Data":"8d172c9a74f28d18513865e95dba01ad90e7c60fe0351a9e85fe0fd765248f1d"} Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.915000 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerStarted","Data":"d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52"} Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.918123 4929 generic.go:334] "Generic (PLEG): container finished" podID="d73a449d-0b0f-40a9-9cc7-5e44447b2c86" containerID="4c2c63d65925d087720cb093da6f11e1c0fd2badb401568b1904d09e45b554c8" exitCode=0 Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.918154 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j8b9x" event={"ID":"d73a449d-0b0f-40a9-9cc7-5e44447b2c86","Type":"ContainerDied","Data":"4c2c63d65925d087720cb093da6f11e1c0fd2badb401568b1904d09e45b554c8"} Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.923566 4929 scope.go:117] "RemoveContainer" containerID="dedfaab2c2b9cc2cd612bc15f496e7f441b45b7f698ec0232ceba814f30d469e" Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.969622 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v89pl"] Oct 02 11:30:18 crc kubenswrapper[4929]: I1002 11:30:18.973799 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-v89pl"] Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.162342 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:30:19 crc kubenswrapper[4929]: W1002 11:30:19.184046 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod870cd509_a882_46d4_ae9d_c89ee843385a.slice/crio-96cc5772dd3a996bc587e15ddbf1127a1e2b232cc033154460387f1b4dcc5c2f WatchSource:0}: Error finding container 96cc5772dd3a996bc587e15ddbf1127a1e2b232cc033154460387f1b4dcc5c2f: Status 404 returned error can't find the container with id 96cc5772dd3a996bc587e15ddbf1127a1e2b232cc033154460387f1b4dcc5c2f Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.486641 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zr8dn"] Oct 02 11:30:19 crc kubenswrapper[4929]: E1002 11:30:19.486991 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerName="dnsmasq-dns" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.487015 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerName="dnsmasq-dns" Oct 02 11:30:19 crc kubenswrapper[4929]: E1002 11:30:19.487050 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerName="init" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.487057 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerName="init" Oct 
02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.487218 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b19762-0589-4167-8a91-b0ccd22fefdf" containerName="dnsmasq-dns" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.487724 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.490163 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qpp2l" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.491762 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.491988 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.499780 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zr8dn"] Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.661115 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-scripts\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.661426 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-combined-ca-bundle\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.661489 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69bfcad7-d630-4361-b28d-f072ac3f84a0-etc-machine-id\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.661514 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-db-sync-config-data\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.661536 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-config-data\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.661564 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8g5\" (UniqueName: \"kubernetes.io/projected/69bfcad7-d630-4361-b28d-f072ac3f84a0-kube-api-access-4s8g5\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.738222 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8rg59"] Oct 02 11:30:19 crc 
kubenswrapper[4929]: I1002 11:30:19.739370 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.741814 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wwzzc" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.741814 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.768983 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69bfcad7-d630-4361-b28d-f072ac3f84a0-etc-machine-id\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.769042 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-db-sync-config-data\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.769071 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-config-data\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.769095 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8g5\" (UniqueName: \"kubernetes.io/projected/69bfcad7-d630-4361-b28d-f072ac3f84a0-kube-api-access-4s8g5\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.769161 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-scripts\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.769199 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-combined-ca-bundle\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.771927 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69bfcad7-d630-4361-b28d-f072ac3f84a0-etc-machine-id\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.781321 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-config-data\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.786917 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-combined-ca-bundle\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.788589 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-db-sync-config-data\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.792625 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8rg59"] Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.816417 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-scripts\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.816497 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8g5\" (UniqueName: \"kubernetes.io/projected/69bfcad7-d630-4361-b28d-f072ac3f84a0-kube-api-access-4s8g5\") pod \"cinder-db-sync-zr8dn\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.880326 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74p9l\" (UniqueName: \"kubernetes.io/projected/d1b3c946-3ee6-4320-ab89-6fb932ec3292-kube-api-access-74p9l\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.880812 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-db-sync-config-data\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.881015 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-combined-ca-bundle\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.893024 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qlnn8"] Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.894261 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.896223 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-f2kgs" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.896590 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.897019 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.913220 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qlnn8"] Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.938013 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6hhg9" event={"ID":"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605","Type":"ContainerStarted","Data":"c3d54d33154267067894be6f6bf383c4dd38b0d264f89c53eca1cce53b8f5f42"} Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.943107 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"870cd509-a882-46d4-ae9d-c89ee843385a","Type":"ContainerStarted","Data":"96cc5772dd3a996bc587e15ddbf1127a1e2b232cc033154460387f1b4dcc5c2f"} Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.945640 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31f4d876-4669-4a1f-b3c4-90f39131f726","Type":"ContainerStarted","Data":"bf1df801c33923c52b57d70ffbeb276141d6b88cfc7248fef720e8247b706bf1"} Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.960581 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6hhg9" podStartSLOduration=2.960557031 podStartE2EDuration="2.960557031s" podCreationTimestamp="2025-10-02 11:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:19.956695717 +0000 UTC m=+1220.507062101" watchObservedRunningTime="2025-10-02 11:30:19.960557031 +0000 UTC m=+1220.510923395" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.981389 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.981373781 podStartE2EDuration="6.981373781s" podCreationTimestamp="2025-10-02 11:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:19.977899667 +0000 UTC m=+1220.528266031" watchObservedRunningTime="2025-10-02 11:30:19.981373781 +0000 UTC m=+1220.531740145" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.983776 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-db-sync-config-data\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.983854 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-combined-ca-bundle\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " 
pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.983917 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74p9l\" (UniqueName: \"kubernetes.io/projected/d1b3c946-3ee6-4320-ab89-6fb932ec3292-kube-api-access-74p9l\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.989144 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-combined-ca-bundle\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:19 crc kubenswrapper[4929]: I1002 11:30:19.999546 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74p9l\" (UniqueName: \"kubernetes.io/projected/d1b3c946-3ee6-4320-ab89-6fb932ec3292-kube-api-access-74p9l\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.009645 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-db-sync-config-data\") pod \"barbican-db-sync-8rg59\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.080719 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.085222 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lljmz\" (UniqueName: \"kubernetes.io/projected/892a7315-3f0a-4523-9c05-3a9a8ca321b5-kube-api-access-lljmz\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.085368 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-combined-ca-bundle\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.085404 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-config\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.111756 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.170881 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b19762-0589-4167-8a91-b0ccd22fefdf" path="/var/lib/kubelet/pods/37b19762-0589-4167-8a91-b0ccd22fefdf/volumes" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.188807 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-combined-ca-bundle\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.188857 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-config\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.188937 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lljmz\" (UniqueName: \"kubernetes.io/projected/892a7315-3f0a-4523-9c05-3a9a8ca321b5-kube-api-access-lljmz\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.194682 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-combined-ca-bundle\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.199460 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-config\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.212484 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lljmz\" (UniqueName: \"kubernetes.io/projected/892a7315-3f0a-4523-9c05-3a9a8ca321b5-kube-api-access-lljmz\") pod \"neutron-db-sync-qlnn8\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.228911 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.236860 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.391873 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-combined-ca-bundle\") pod \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.392183 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-config-data\") pod \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.392267 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-logs\") pod \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.392290 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2lrg\" (UniqueName: \"kubernetes.io/projected/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-kube-api-access-q2lrg\") pod \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.392322 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-scripts\") pod \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\" (UID: \"d73a449d-0b0f-40a9-9cc7-5e44447b2c86\") " Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.393249 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-logs" (OuterVolumeSpecName: "logs") pod "d73a449d-0b0f-40a9-9cc7-5e44447b2c86" (UID: "d73a449d-0b0f-40a9-9cc7-5e44447b2c86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.411229 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-scripts" (OuterVolumeSpecName: "scripts") pod "d73a449d-0b0f-40a9-9cc7-5e44447b2c86" (UID: "d73a449d-0b0f-40a9-9cc7-5e44447b2c86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.415696 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-kube-api-access-q2lrg" (OuterVolumeSpecName: "kube-api-access-q2lrg") pod "d73a449d-0b0f-40a9-9cc7-5e44447b2c86" (UID: "d73a449d-0b0f-40a9-9cc7-5e44447b2c86"). InnerVolumeSpecName "kube-api-access-q2lrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.446244 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d73a449d-0b0f-40a9-9cc7-5e44447b2c86" (UID: "d73a449d-0b0f-40a9-9cc7-5e44447b2c86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.493091 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-config-data" (OuterVolumeSpecName: "config-data") pod "d73a449d-0b0f-40a9-9cc7-5e44447b2c86" (UID: "d73a449d-0b0f-40a9-9cc7-5e44447b2c86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.493781 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.493838 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.493848 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.493857 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.493867 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2lrg\" (UniqueName: \"kubernetes.io/projected/d73a449d-0b0f-40a9-9cc7-5e44447b2c86-kube-api-access-q2lrg\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.633029 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zr8dn"] Oct 02 11:30:20 crc kubenswrapper[4929]: W1002 11:30:20.645900 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69bfcad7_d630_4361_b28d_f072ac3f84a0.slice/crio-9b34ab613530d9e75905f5f57186438f8859c4a643142b0f3841e78f9d898fb3 WatchSource:0}: Error finding container 9b34ab613530d9e75905f5f57186438f8859c4a643142b0f3841e78f9d898fb3: Status 404 returned error can't find the container with id 9b34ab613530d9e75905f5f57186438f8859c4a643142b0f3841e78f9d898fb3 Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.715030 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8rg59"] Oct 02 11:30:20 crc kubenswrapper[4929]: W1002 11:30:20.728164 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b3c946_3ee6_4320_ab89_6fb932ec3292.slice/crio-609844dea914367db2a7a0b88a6af54a49a1f00bf9137e9ee8166a2328c7c23b WatchSource:0}: Error finding container 609844dea914367db2a7a0b88a6af54a49a1f00bf9137e9ee8166a2328c7c23b: Status 404 returned error can't find the container with id 609844dea914367db2a7a0b88a6af54a49a1f00bf9137e9ee8166a2328c7c23b Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.842273 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qlnn8"] Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.973293 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8rg59" 
event={"ID":"d1b3c946-3ee6-4320-ab89-6fb932ec3292","Type":"ContainerStarted","Data":"609844dea914367db2a7a0b88a6af54a49a1f00bf9137e9ee8166a2328c7c23b"} Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.975635 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zr8dn" event={"ID":"69bfcad7-d630-4361-b28d-f072ac3f84a0","Type":"ContainerStarted","Data":"9b34ab613530d9e75905f5f57186438f8859c4a643142b0f3841e78f9d898fb3"} Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.979870 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-j8b9x" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.980514 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-j8b9x" event={"ID":"d73a449d-0b0f-40a9-9cc7-5e44447b2c86","Type":"ContainerDied","Data":"989bdb5c536b88edf9b384461607ac3a2e42f9f9b7a3f125b58e18c0e34c7108"} Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.980603 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="989bdb5c536b88edf9b384461607ac3a2e42f9f9b7a3f125b58e18c0e34c7108" Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.984816 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"870cd509-a882-46d4-ae9d-c89ee843385a","Type":"ContainerStarted","Data":"25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7"} Oct 02 11:30:20 crc kubenswrapper[4929]: I1002 11:30:20.984855 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"870cd509-a882-46d4-ae9d-c89ee843385a","Type":"ContainerStarted","Data":"10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1"} Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.080926 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.080910732 podStartE2EDuration="4.080910732s" podCreationTimestamp="2025-10-02 11:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:21.006524131 +0000 UTC m=+1221.556890515" watchObservedRunningTime="2025-10-02 11:30:21.080910732 +0000 UTC m=+1221.631277096" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.087013 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-99d9d588b-ddwr8"] Oct 02 11:30:21 crc kubenswrapper[4929]: E1002 11:30:21.087378 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73a449d-0b0f-40a9-9cc7-5e44447b2c86" containerName="placement-db-sync" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.087395 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73a449d-0b0f-40a9-9cc7-5e44447b2c86" containerName="placement-db-sync" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.087552 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73a449d-0b0f-40a9-9cc7-5e44447b2c86" containerName="placement-db-sync" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.089482 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.095551 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.095775 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.095982 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.096003 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.096063 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gt6zf" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.141801 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-99d9d588b-ddwr8"] Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.244403 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-public-tls-certs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.244478 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-config-data\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.244534 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b67fd7d-2814-4efd-ad06-ee8283104d49-logs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.244627 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-internal-tls-certs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.244716 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2kmp\" (UniqueName: \"kubernetes.io/projected/4b67fd7d-2814-4efd-ad06-ee8283104d49-kube-api-access-d2kmp\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.244803 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-combined-ca-bundle\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.244844 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-scripts\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.346898 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2kmp\" (UniqueName: \"kubernetes.io/projected/4b67fd7d-2814-4efd-ad06-ee8283104d49-kube-api-access-d2kmp\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.347010 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-combined-ca-bundle\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.347032 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-scripts\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.347077 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-public-tls-certs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.347128 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-config-data\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.347156 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b67fd7d-2814-4efd-ad06-ee8283104d49-logs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.347220 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-internal-tls-certs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.347699 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b67fd7d-2814-4efd-ad06-ee8283104d49-logs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.353188 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-public-tls-certs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.353348 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-config-data\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.353628 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-combined-ca-bundle\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.353883 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-internal-tls-certs\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.368874 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-scripts\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.372988 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2kmp\" (UniqueName: \"kubernetes.io/projected/4b67fd7d-2814-4efd-ad06-ee8283104d49-kube-api-access-d2kmp\") pod \"placement-99d9d588b-ddwr8\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:21 crc kubenswrapper[4929]: I1002 11:30:21.465828 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:22 crc kubenswrapper[4929]: I1002 11:30:22.025690 4929 generic.go:334] "Generic (PLEG): container finished" podID="c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" containerID="c3d54d33154267067894be6f6bf383c4dd38b0d264f89c53eca1cce53b8f5f42" exitCode=0 Oct 02 11:30:22 crc kubenswrapper[4929]: I1002 11:30:22.025987 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6hhg9" event={"ID":"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605","Type":"ContainerDied","Data":"c3d54d33154267067894be6f6bf383c4dd38b0d264f89c53eca1cce53b8f5f42"} Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.519186 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.582737 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvlg6\" (UniqueName: \"kubernetes.io/projected/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-kube-api-access-cvlg6\") pod \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.582840 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-fernet-keys\") pod \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.582897 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-config-data\") pod \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.582947 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-scripts\") pod \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.582985 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-combined-ca-bundle\") pod \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.583006 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-credential-keys\") pod \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\" (UID: \"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605\") " Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.593389 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" (UID: "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.593500 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-scripts" (OuterVolumeSpecName: "scripts") pod "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" (UID: "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.593535 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" (UID: "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.593835 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-kube-api-access-cvlg6" (OuterVolumeSpecName: "kube-api-access-cvlg6") pod "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" (UID: "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605"). InnerVolumeSpecName "kube-api-access-cvlg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.613580 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-config-data" (OuterVolumeSpecName: "config-data") pod "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" (UID: "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.617806 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" (UID: "c44dff1a-4b7b-4bf0-aaab-26ed02bb1605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.684825 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvlg6\" (UniqueName: \"kubernetes.io/projected/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-kube-api-access-cvlg6\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.684861 4929 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.684875 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.684886 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.684898 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.684910 4929 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:23 crc kubenswrapper[4929]: I1002 11:30:23.839723 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-99d9d588b-ddwr8"] Oct 02 11:30:23 crc kubenswrapper[4929]: W1002 11:30:23.849569 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b67fd7d_2814_4efd_ad06_ee8283104d49.slice/crio-e38f439d0190cd896bc49946a7c6eeb1bc9d00e6dc845a7d0d81aab6ba4a60de WatchSource:0}: Error finding container 
e38f439d0190cd896bc49946a7c6eeb1bc9d00e6dc845a7d0d81aab6ba4a60de: Status 404 returned error can't find the container with id e38f439d0190cd896bc49946a7c6eeb1bc9d00e6dc845a7d0d81aab6ba4a60de Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.068931 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qlnn8" event={"ID":"892a7315-3f0a-4523-9c05-3a9a8ca321b5","Type":"ContainerStarted","Data":"28bd669c66fa526e77aca54c489ecfc6d4c91a615c3e8bf12eb5abf47179648c"} Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.069000 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qlnn8" event={"ID":"892a7315-3f0a-4523-9c05-3a9a8ca321b5","Type":"ContainerStarted","Data":"fb31a8bf26fcd09db047c2e1c9916e1f4bfe8d3e19e7b3339a50269aa8b9c02b"} Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.078232 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6hhg9" event={"ID":"c44dff1a-4b7b-4bf0-aaab-26ed02bb1605","Type":"ContainerDied","Data":"8d172c9a74f28d18513865e95dba01ad90e7c60fe0351a9e85fe0fd765248f1d"} Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.078282 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d172c9a74f28d18513865e95dba01ad90e7c60fe0351a9e85fe0fd765248f1d" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.078290 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6hhg9" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.084295 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerStarted","Data":"d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917"} Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.086770 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99d9d588b-ddwr8" event={"ID":"4b67fd7d-2814-4efd-ad06-ee8283104d49","Type":"ContainerStarted","Data":"fbb7ed5b03ef1c144f9d8326fca0faa2b3ea252a058e454c2f5d13b47dbb1af1"} Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.086825 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99d9d588b-ddwr8" event={"ID":"4b67fd7d-2814-4efd-ad06-ee8283104d49","Type":"ContainerStarted","Data":"e38f439d0190cd896bc49946a7c6eeb1bc9d00e6dc845a7d0d81aab6ba4a60de"} Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.090417 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qlnn8" podStartSLOduration=5.090403478 podStartE2EDuration="5.090403478s" podCreationTimestamp="2025-10-02 11:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:24.088288101 +0000 UTC m=+1224.638654485" watchObservedRunningTime="2025-10-02 11:30:24.090403478 +0000 UTC m=+1224.640769842" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.133358 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7bd786b699-2sf9r"] Oct 02 11:30:24 crc kubenswrapper[4929]: E1002 11:30:24.134914 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" containerName="keystone-bootstrap" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.135003 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" 
containerName="keystone-bootstrap" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.135238 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" containerName="keystone-bootstrap" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.135866 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bd786b699-2sf9r" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.144929 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.147542 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9qdpv" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.147775 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.148190 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.152167 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.154315 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.177153 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd786b699-2sf9r"] Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.195811 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.195885 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.254294 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.286601 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294009 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-config-data\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294050 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-combined-ca-bundle\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r" Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294151 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-public-tls-certs\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r" Oct 02 11:30:24 crc 
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294171 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-internal-tls-certs\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294195 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-fernet-keys\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294252 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-credential-keys\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294297 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gztcl\" (UniqueName: \"kubernetes.io/projected/c89c2414-cee5-46e9-9284-cd96fb472fd7-kube-api-access-gztcl\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.294318 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-scripts\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396091 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-credential-keys\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396378 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gztcl\" (UniqueName: \"kubernetes.io/projected/c89c2414-cee5-46e9-9284-cd96fb472fd7-kube-api-access-gztcl\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396404 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-scripts\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396457 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-config-data\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396473 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-combined-ca-bundle\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396520 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-public-tls-certs\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396538 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-internal-tls-certs\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.396562 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-fernet-keys\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.400154 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-config-data\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.402401 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-fernet-keys\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.402708 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-scripts\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.402906 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-public-tls-certs\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.403554 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-internal-tls-certs\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.406421 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-combined-ca-bundle\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.407153 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-credential-keys\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.412793 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gztcl\" (UniqueName: \"kubernetes.io/projected/c89c2414-cee5-46e9-9284-cd96fb472fd7-kube-api-access-gztcl\") pod \"keystone-7bd786b699-2sf9r\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.463407 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:24 crc kubenswrapper[4929]: I1002 11:30:24.799152 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bd786b699-2sf9r"]
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.102798 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99d9d588b-ddwr8" event={"ID":"4b67fd7d-2814-4efd-ad06-ee8283104d49","Type":"ContainerStarted","Data":"157276ef9d3a545d2f5ce4288c1bab5d100b24eab57de2c4e08c2b13bd82b387"}
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.104822 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-99d9d588b-ddwr8"
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.104881 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-99d9d588b-ddwr8"
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.113285 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd786b699-2sf9r" event={"ID":"c89c2414-cee5-46e9-9284-cd96fb472fd7","Type":"ContainerStarted","Data":"5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3"}
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.113333 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.113347 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd786b699-2sf9r" event={"ID":"c89c2414-cee5-46e9-9284-cd96fb472fd7","Type":"ContainerStarted","Data":"09ac7038040fd303886c5015104dd3d2ff0fd294f873fef0596991d6d8779ae3"}
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.113365 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7bd786b699-2sf9r"
Oct 02 11:30:25 crc kubenswrapper[4929]: I1002 11:30:25.113855 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
observedRunningTime="2025-10-02 11:30:25.121254531 +0000 UTC m=+1225.671620895" watchObservedRunningTime="2025-10-02 11:30:25.128528757 +0000 UTC m=+1225.678895121" Oct 02 11:30:27 crc kubenswrapper[4929]: I1002 11:30:27.130203 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:30:27 crc kubenswrapper[4929]: I1002 11:30:27.130764 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:30:27 crc kubenswrapper[4929]: I1002 11:30:27.226696 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:30:27 crc kubenswrapper[4929]: I1002 11:30:27.233161 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:30:27 crc kubenswrapper[4929]: I1002 11:30:27.254722 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7bd786b699-2sf9r" podStartSLOduration=3.254707968 podStartE2EDuration="3.254707968s" podCreationTimestamp="2025-10-02 11:30:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:25.144655971 +0000 UTC m=+1225.695022335" watchObservedRunningTime="2025-10-02 11:30:27.254707968 +0000 UTC m=+1227.805074332" Oct 02 11:30:28 crc kubenswrapper[4929]: I1002 11:30:28.573398 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:28 crc kubenswrapper[4929]: I1002 11:30:28.574532 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:28 crc kubenswrapper[4929]: I1002 11:30:28.604532 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:28 crc kubenswrapper[4929]: I1002 11:30:28.625052 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:29 crc kubenswrapper[4929]: I1002 11:30:29.145347 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:29 crc kubenswrapper[4929]: I1002 11:30:29.145381 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:31 crc kubenswrapper[4929]: I1002 11:30:31.038245 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:31 crc kubenswrapper[4929]: I1002 11:30:31.041989 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:30:41 crc kubenswrapper[4929]: E1002 11:30:41.131694 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 02 11:30:41 crc kubenswrapper[4929]: E1002 11:30:41.132527 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvv5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 11:30:41 crc kubenswrapper[4929]: E1002 11:30:41.133750 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" Oct 02 11:30:41 crc kubenswrapper[4929]: I1002 11:30:41.246674 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-central-agent" containerID="cri-o://e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c" gracePeriod=30 Oct 02 11:30:41 crc kubenswrapper[4929]: I1002 11:30:41.246719 4929 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="sg-core" containerID="cri-o://d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917" gracePeriod=30 Oct 02 11:30:41 crc kubenswrapper[4929]: I1002 11:30:41.246743 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-notification-agent" containerID="cri-o://d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52" gracePeriod=30 Oct 02 11:30:42 crc kubenswrapper[4929]: E1002 11:30:42.054459 4929 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 02 11:30:42 crc kubenswrapper[4929]: E1002 11:30:42.054661 4929 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4s8g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zr8dn_openstack(69bfcad7-d630-4361-b28d-f072ac3f84a0): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Oct 02 11:30:42 crc kubenswrapper[4929]: E1002 11:30:42.056718 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zr8dn" podUID="69bfcad7-d630-4361-b28d-f072ac3f84a0" Oct 02 11:30:42 crc kubenswrapper[4929]: I1002 11:30:42.270047 4929 generic.go:334] "Generic (PLEG): container finished" podID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerID="d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917" exitCode=2 Oct 02 11:30:42 crc kubenswrapper[4929]: I1002 11:30:42.271343 4929 generic.go:334] "Generic (PLEG): container finished" podID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerID="e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c" exitCode=0 Oct 02 11:30:42 crc kubenswrapper[4929]: I1002 11:30:42.270144 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerDied","Data":"d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917"} Oct 02 11:30:42 crc kubenswrapper[4929]: I1002 11:30:42.271552 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerDied","Data":"e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c"} Oct 02 11:30:42 crc kubenswrapper[4929]: I1002 11:30:42.273681 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8rg59" event={"ID":"d1b3c946-3ee6-4320-ab89-6fb932ec3292","Type":"ContainerStarted","Data":"24faef81b6cede3d67c9462380194d13406ef6780e10159894ee6b2ef0ce25e9"} Oct 02 11:30:42 crc kubenswrapper[4929]: E1002 11:30:42.275022 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-zr8dn" podUID="69bfcad7-d630-4361-b28d-f072ac3f84a0" Oct 02 11:30:42 crc kubenswrapper[4929]: I1002 11:30:42.291325 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8rg59" podStartSLOduration=1.9954628140000001 podStartE2EDuration="23.291302073s" podCreationTimestamp="2025-10-02 11:30:19 +0000 UTC" firstStartedPulling="2025-10-02 11:30:20.7314081 +0000 UTC m=+1221.281774464" lastFinishedPulling="2025-10-02 11:30:42.027247359 +0000 UTC m=+1242.577613723" observedRunningTime="2025-10-02 11:30:42.290306656 +0000 UTC m=+1242.840673030" watchObservedRunningTime="2025-10-02 11:30:42.291302073 +0000 UTC m=+1242.841668437" Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.854656 4929 util.go:48] "No ready sandbox for pod can be found. 
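Both pull failures above (ubi9/httpd-24 for ceilometer-0's proxy-httpd, openstack-cinder-api for cinder-db-sync-zr8dn) follow the same sequence: a canceled PullImage RPC, an UnhandledError dump of the full container spec, then pod_workers retries that escalate from ErrImagePull to ImagePullBackOff. A sketch summarizing both signals from the log; regexes and names are illustrative:

```python
import re
import sys

PULL_FAIL = re.compile(r'"PullImage from image service failed".*image="(?P<image>[^"]+)"')
BACKOFF = re.compile(r'with (?P<kind>ErrImagePull|ImagePullBackOff):.*pod="(?P<pod>[^"]+)"')

def pull_failures(lines):
    """Collect failed image pulls and the pods stuck in pull errors or backoff."""
    images, pods = set(), {}
    for line in lines:
        if m := PULL_FAIL.search(line):
            images.add(m["image"])
        if m := BACKOFF.search(line):
            pods[m["pod"]] = m["kind"]  # keeps the most recent state per pod
    return images, pods

if __name__ == "__main__":
    images, pods = pull_failures(sys.stdin)
    for image in sorted(images):
        print("pull failed:", image)
    for pod, kind in sorted(pods.items()):
        print(f"{pod}: {kind}")
```

On this excerpt it would report both images and show cinder-db-sync-zr8dn ending in ImagePullBackOff while ceilometer-0's last recorded state is ErrImagePull.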
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.854656 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.953117 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-sg-core-conf-yaml\") pod \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") "
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.953541 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-scripts\") pod \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") "
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.953568 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-config-data\") pod \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") "
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.953665 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-run-httpd\") pod \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") "
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.953740 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvv5m\" (UniqueName: \"kubernetes.io/projected/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-kube-api-access-wvv5m\") pod \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") "
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.953793 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-log-httpd\") pod \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") "
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.953817 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-combined-ca-bundle\") pod \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\" (UID: \"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8\") "
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.954134 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" (UID: "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.954320 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.954432 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" (UID: "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.960148 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-kube-api-access-wvv5m" (OuterVolumeSpecName: "kube-api-access-wvv5m") pod "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" (UID: "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8"). InnerVolumeSpecName "kube-api-access-wvv5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.960645 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-scripts" (OuterVolumeSpecName: "scripts") pod "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" (UID: "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:30:43 crc kubenswrapper[4929]: I1002 11:30:43.987519 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" (UID: "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.022837 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" (UID: "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.023020 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-config-data" (OuterVolumeSpecName: "config-data") pod "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" (UID: "fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.055675 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvv5m\" (UniqueName: \"kubernetes.io/projected/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-kube-api-access-wvv5m\") on node \"crc\" DevicePath \"\""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.055734 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.055753 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.055793 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.055809 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.055825 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.295032 4929 generic.go:334] "Generic (PLEG): container finished" podID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerID="d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52" exitCode=0
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.295124 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerDied","Data":"d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52"}
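Tearing down ceilometer-0 mirrors the mount path: UnmountVolume started, UnmountVolume.TearDown succeeded, then Volume detached per volume. A sketch pairing the started and detached records to spot unmounts that never completed; same escaped-quote assumption as before, names illustrative:

```python
import re
import sys

UNIQUE = re.compile(r'UniqueName: \\?"(?P<vol>kubernetes\.io/[^"\\]+)\\?"')

def teardown_progress(lines):
    """Pair UnmountVolume started records with their 'Volume detached' records."""
    started, detached = set(), set()
    for line in lines:
        m = UNIQUE.search(line)
        if not m:
            continue
        if "operationExecutor.UnmountVolume started" in line:
            started.add(m["vol"])
        elif "Volume detached for volume" in line:
            detached.add(m["vol"])
    return started - detached  # unmounts begun but never confirmed detached

if __name__ == "__main__":
    for vol in sorted(teardown_progress(sys.stdin)):
        print("not detached:", vol)
```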
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.295151 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.295567 4929 scope.go:117] "RemoveContainer" containerID="d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.295471 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8","Type":"ContainerDied","Data":"bfab6c0655342def95d494c6d46a7dfedf508f550ec4fd68eb90927ff2593dbb"}
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.315868 4929 scope.go:117] "RemoveContainer" containerID="d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.354705 4929 scope.go:117] "RemoveContainer" containerID="e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.379726 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.389445 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.395211 4929 scope.go:117] "RemoveContainer" containerID="d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917"
Oct 02 11:30:44 crc kubenswrapper[4929]: E1002 11:30:44.395824 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917\": container with ID starting with d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917 not found: ID does not exist" containerID="d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.395922 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917"} err="failed to get container status \"d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917\": rpc error: code = NotFound desc = could not find container \"d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917\": container with ID starting with d26a90b47c27ce6317c9097e176a970b0478a9f2e9311b163e560ad18f5a6917 not found: ID does not exist"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.396027 4929 scope.go:117] "RemoveContainer" containerID="d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52"
Oct 02 11:30:44 crc kubenswrapper[4929]: E1002 11:30:44.396426 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52\": container with ID starting with d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52 not found: ID does not exist" containerID="d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.396457 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52"} err="failed to get container status \"d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52\": rpc error: code = NotFound desc = could not find container \"d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52\": container with ID starting with d5dba9e922f4db6828658bb7cf36cc5e4d8ce605de8818058d6fcd5014561b52 not found: ID does not exist"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.396477 4929 scope.go:117] "RemoveContainer" containerID="e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c"
Oct 02 11:30:44 crc kubenswrapper[4929]: E1002 11:30:44.396982 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c\": container with ID starting with e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c not found: ID does not exist" containerID="e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.397088 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c"} err="failed to get container status \"e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c\": rpc error: code = NotFound desc = could not find container \"e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c\": container with ID starting with e285f01866169b57d1b693cf4f5c35f0a7cd35c932548d82749ba89876966c5c not found: ID does not exist"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.399613 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:30:44 crc kubenswrapper[4929]: E1002 11:30:44.400147 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="sg-core"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.400171 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="sg-core"
Oct 02 11:30:44 crc kubenswrapper[4929]: E1002 11:30:44.400204 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-central-agent"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.400213 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-central-agent"
Oct 02 11:30:44 crc kubenswrapper[4929]: E1002 11:30:44.400225 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-notification-agent"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.400235 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-notification-agent"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.400462 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="sg-core"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.400478 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-central-agent"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.400489 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" containerName="ceilometer-notification-agent"
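The RemoveContainer / NotFound / "DeleteContainer returned error" triples above are benign: by the time the deletor re-queried the runtime, CRI-O had already removed the containers, so the NotFound responses simply confirm the deletion. A sketch counting removal attempts against NotFound responses per container ID (illustrative):

```python
import re
import sys

REMOVE = re.compile(r'"RemoveContainer" containerID="(?P<id>[0-9a-f]+)"')
NOTFOUND = re.compile(r'code = NotFound.*containerID="(?P<id>[0-9a-f]+)"')

def removal_report(lines):
    """Count RemoveContainer attempts and NotFound responses per container ID."""
    attempts, notfound = {}, {}
    for line in lines:
        if m := REMOVE.search(line):
            attempts[m["id"]] = attempts.get(m["id"], 0) + 1
        if m := NOTFOUND.search(line):
            notfound[m["id"]] = notfound.get(m["id"], 0) + 1
    return attempts, notfound

if __name__ == "__main__":
    attempts, notfound = removal_report(sys.stdin)
    for cid, n in sorted(attempts.items()):
        print(f"{cid[:12]}: {n} removal(s), {notfound.get(cid, 0)} NotFound response(s)")
```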
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.402785 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.405777 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.405784 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.410104 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.463760 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-log-httpd\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.463847 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-config-data\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.463867 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.463899 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4v5v\" (UniqueName: \"kubernetes.io/projected/1eb9e992-28b7-4e84-acd9-6022cbaced5e-kube-api-access-g4v5v\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.463928 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.463986 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-run-httpd\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.464077 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-scripts\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.566016 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-log-httpd\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.566120 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-config-data\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.566171 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.566216 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4v5v\" (UniqueName: \"kubernetes.io/projected/1eb9e992-28b7-4e84-acd9-6022cbaced5e-kube-api-access-g4v5v\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.566251 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.566285 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-run-httpd\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.566375 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-scripts\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.567200 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-log-httpd\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.567615 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-run-httpd\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.571014 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.571436 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.572148 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-scripts\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.573420 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-config-data\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.589279 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4v5v\" (UniqueName: \"kubernetes.io/projected/1eb9e992-28b7-4e84-acd9-6022cbaced5e-kube-api-access-g4v5v\") pod \"ceilometer-0\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.723078 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.738315 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:30:44 crc kubenswrapper[4929]: I1002 11:30:44.738387 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:30:45 crc kubenswrapper[4929]: I1002 11:30:45.153439 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:30:45 crc kubenswrapper[4929]: W1002 11:30:45.157358 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eb9e992_28b7_4e84_acd9_6022cbaced5e.slice/crio-b762613924f74d92b50fecfef1e6b05937644de4a3d6381adfb5b81254eed4c6 WatchSource:0}: Error finding container b762613924f74d92b50fecfef1e6b05937644de4a3d6381adfb5b81254eed4c6: Status 404 returned error can't find the container with id b762613924f74d92b50fecfef1e6b05937644de4a3d6381adfb5b81254eed4c6
Oct 02 11:30:45 crc kubenswrapper[4929]: I1002 11:30:45.305325 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerStarted","Data":"b762613924f74d92b50fecfef1e6b05937644de4a3d6381adfb5b81254eed4c6"}
Oct 02 11:30:46 crc kubenswrapper[4929]: I1002 11:30:46.168556 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8" path="/var/lib/kubelet/pods/fbcfe473-f5aa-424c-bcdb-ef3fa0c4ada8/volumes"
Oct 02 11:30:46 crc kubenswrapper[4929]: I1002 11:30:46.316021 4929 generic.go:334] "Generic (PLEG): container finished" podID="892a7315-3f0a-4523-9c05-3a9a8ca321b5" containerID="28bd669c66fa526e77aca54c489ecfc6d4c91a615c3e8bf12eb5abf47179648c" exitCode=0
event={"ID":"892a7315-3f0a-4523-9c05-3a9a8ca321b5","Type":"ContainerDied","Data":"28bd669c66fa526e77aca54c489ecfc6d4c91a615c3e8bf12eb5abf47179648c"} Oct 02 11:30:46 crc kubenswrapper[4929]: I1002 11:30:46.317717 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerStarted","Data":"74039636db2c6dff50014462c615eaa09b841fcc8d33ff434d4ea85ffda62f92"} Oct 02 11:30:46 crc kubenswrapper[4929]: I1002 11:30:46.319594 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1b3c946-3ee6-4320-ab89-6fb932ec3292" containerID="24faef81b6cede3d67c9462380194d13406ef6780e10159894ee6b2ef0ce25e9" exitCode=0 Oct 02 11:30:46 crc kubenswrapper[4929]: I1002 11:30:46.319636 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8rg59" event={"ID":"d1b3c946-3ee6-4320-ab89-6fb932ec3292","Type":"ContainerDied","Data":"24faef81b6cede3d67c9462380194d13406ef6780e10159894ee6b2ef0ce25e9"} Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.335941 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerStarted","Data":"ba736854a52194f11e467bacf9703ab3db3238818aad51ed6a333fa0cc04d412"} Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.612155 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.624985 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.724937 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-combined-ca-bundle\") pod \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.725128 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74p9l\" (UniqueName: \"kubernetes.io/projected/d1b3c946-3ee6-4320-ab89-6fb932ec3292-kube-api-access-74p9l\") pod \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.725179 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lljmz\" (UniqueName: \"kubernetes.io/projected/892a7315-3f0a-4523-9c05-3a9a8ca321b5-kube-api-access-lljmz\") pod \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.725217 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-combined-ca-bundle\") pod \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.725236 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-config\") pod \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\" (UID: \"892a7315-3f0a-4523-9c05-3a9a8ca321b5\") " Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.725299 4929 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-db-sync-config-data\") pod \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\" (UID: \"d1b3c946-3ee6-4320-ab89-6fb932ec3292\") " Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.729178 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d1b3c946-3ee6-4320-ab89-6fb932ec3292" (UID: "d1b3c946-3ee6-4320-ab89-6fb932ec3292"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.729251 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892a7315-3f0a-4523-9c05-3a9a8ca321b5-kube-api-access-lljmz" (OuterVolumeSpecName: "kube-api-access-lljmz") pod "892a7315-3f0a-4523-9c05-3a9a8ca321b5" (UID: "892a7315-3f0a-4523-9c05-3a9a8ca321b5"). InnerVolumeSpecName "kube-api-access-lljmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.730811 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b3c946-3ee6-4320-ab89-6fb932ec3292-kube-api-access-74p9l" (OuterVolumeSpecName: "kube-api-access-74p9l") pod "d1b3c946-3ee6-4320-ab89-6fb932ec3292" (UID: "d1b3c946-3ee6-4320-ab89-6fb932ec3292"). InnerVolumeSpecName "kube-api-access-74p9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.747113 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-config" (OuterVolumeSpecName: "config") pod "892a7315-3f0a-4523-9c05-3a9a8ca321b5" (UID: "892a7315-3f0a-4523-9c05-3a9a8ca321b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.747320 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1b3c946-3ee6-4320-ab89-6fb932ec3292" (UID: "d1b3c946-3ee6-4320-ab89-6fb932ec3292"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.748857 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "892a7315-3f0a-4523-9c05-3a9a8ca321b5" (UID: "892a7315-3f0a-4523-9c05-3a9a8ca321b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.828090 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74p9l\" (UniqueName: \"kubernetes.io/projected/d1b3c946-3ee6-4320-ab89-6fb932ec3292-kube-api-access-74p9l\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.828120 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lljmz\" (UniqueName: \"kubernetes.io/projected/892a7315-3f0a-4523-9c05-3a9a8ca321b5-kube-api-access-lljmz\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.828130 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.828139 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.828148 4929 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b3c946-3ee6-4320-ab89-6fb932ec3292-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:47 crc kubenswrapper[4929]: I1002 11:30:47.828156 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892a7315-3f0a-4523-9c05-3a9a8ca321b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.350242 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerStarted","Data":"99dbf9e33c9c9434c7a7136f2e279fea82b921758a2405426dbc1a3112f2ec84"} Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.353443 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8rg59" event={"ID":"d1b3c946-3ee6-4320-ab89-6fb932ec3292","Type":"ContainerDied","Data":"609844dea914367db2a7a0b88a6af54a49a1f00bf9137e9ee8166a2328c7c23b"} Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.353477 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="609844dea914367db2a7a0b88a6af54a49a1f00bf9137e9ee8166a2328c7c23b" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.353540 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8rg59" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.358177 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qlnn8" event={"ID":"892a7315-3f0a-4523-9c05-3a9a8ca321b5","Type":"ContainerDied","Data":"fb31a8bf26fcd09db047c2e1c9916e1f4bfe8d3e19e7b3339a50269aa8b9c02b"} Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.358229 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb31a8bf26fcd09db047c2e1c9916e1f4bfe8d3e19e7b3339a50269aa8b9c02b" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.358374 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qlnn8" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.518277 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-l6vqp"] Oct 02 11:30:48 crc kubenswrapper[4929]: E1002 11:30:48.518720 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b3c946-3ee6-4320-ab89-6fb932ec3292" containerName="barbican-db-sync" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.518747 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b3c946-3ee6-4320-ab89-6fb932ec3292" containerName="barbican-db-sync" Oct 02 11:30:48 crc kubenswrapper[4929]: E1002 11:30:48.518788 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892a7315-3f0a-4523-9c05-3a9a8ca321b5" containerName="neutron-db-sync" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.518799 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="892a7315-3f0a-4523-9c05-3a9a8ca321b5" containerName="neutron-db-sync" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.519031 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b3c946-3ee6-4320-ab89-6fb932ec3292" containerName="barbican-db-sync" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.519054 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="892a7315-3f0a-4523-9c05-3a9a8ca321b5" containerName="neutron-db-sync" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.520178 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.550024 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-l6vqp"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.593342 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6f9b7d8ff7-88gb5"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.594974 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.598644 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wwzzc" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.598990 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.599219 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.632903 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f9b7d8ff7-88gb5"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.643694 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-config\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.643969 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644183 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644307 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644378 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fd721a-9a86-4aff-98ee-133ebd5c4f41-logs\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644465 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-combined-ca-bundle\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644542 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: 
\"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644610 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvtnb\" (UniqueName: \"kubernetes.io/projected/303bfc40-7c6c-4c15-97ae-4efcd8047d90-kube-api-access-rvtnb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644808 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwt5j\" (UniqueName: \"kubernetes.io/projected/55fd721a-9a86-4aff-98ee-133ebd5c4f41-kube-api-access-lwt5j\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644968 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-svc\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.644985 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data-custom\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.647313 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-655677957d-l5jzm"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.648829 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.652110 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.684659 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d65bddd44-jz54h"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.686539 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.690526 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.690628 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-f2kgs" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.690731 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.690911 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.696467 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-655677957d-l5jzm"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.722535 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d65bddd44-jz54h"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753011 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data-custom\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753077 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgv6t\" (UniqueName: \"kubernetes.io/projected/c72eb1cc-3002-4941-878e-409ee9abeed1-kube-api-access-fgv6t\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753103 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-config\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753139 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jts64\" (UniqueName: \"kubernetes.io/projected/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-kube-api-access-jts64\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753157 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753220 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-ovndb-tls-certs\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 
crc kubenswrapper[4929]: I1002 11:30:48.753243 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753268 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753301 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fd721a-9a86-4aff-98ee-133ebd5c4f41-logs\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753374 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-httpd-config\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753396 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753411 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-combined-ca-bundle\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753427 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvtnb\" (UniqueName: \"kubernetes.io/projected/303bfc40-7c6c-4c15-97ae-4efcd8047d90-kube-api-access-rvtnb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753442 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwt5j\" (UniqueName: \"kubernetes.io/projected/55fd721a-9a86-4aff-98ee-133ebd5c4f41-kube-api-access-lwt5j\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753481 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-combined-ca-bundle\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " 
pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753521 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-logs\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753540 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-svc\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753554 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data-custom\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753578 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-config\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753615 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.753634 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-combined-ca-bundle\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.754538 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-config\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.754903 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fd721a-9a86-4aff-98ee-133ebd5c4f41-logs\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.755153 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " 
pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.755698 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.758446 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data-custom\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.758708 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.760156 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-combined-ca-bundle\") pod \"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.761319 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-svc\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.765975 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-l6vqp"] Oct 02 11:30:48 crc kubenswrapper[4929]: E1002 11:30:48.766624 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[dns-swift-storage-0 kube-api-access-rvtnb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" podUID="303bfc40-7c6c-4c15-97ae-4efcd8047d90" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.772842 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.775156 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvtnb\" (UniqueName: \"kubernetes.io/projected/303bfc40-7c6c-4c15-97ae-4efcd8047d90-kube-api-access-rvtnb\") pod \"dnsmasq-dns-6b7b667979-l6vqp\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.790639 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwt5j\" (UniqueName: \"kubernetes.io/projected/55fd721a-9a86-4aff-98ee-133ebd5c4f41-kube-api-access-lwt5j\") pod 
\"barbican-worker-6f9b7d8ff7-88gb5\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") " pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.814510 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rsr95"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.816323 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.822953 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69dbd5cc54-74gz4"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.824534 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.828174 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.834107 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rsr95"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.839171 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69dbd5cc54-74gz4"] Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855430 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-config\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855472 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855489 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-combined-ca-bundle\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855529 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data-custom\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855552 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgv6t\" (UniqueName: \"kubernetes.io/projected/c72eb1cc-3002-4941-878e-409ee9abeed1-kube-api-access-fgv6t\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855569 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jts64\" (UniqueName: 
\"kubernetes.io/projected/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-kube-api-access-jts64\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855609 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-ovndb-tls-certs\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855647 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-httpd-config\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855669 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-combined-ca-bundle\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.855701 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-logs\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.856101 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-logs\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.865362 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data-custom\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.865848 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.867602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-ovndb-tls-certs\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.868743 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-config\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.869245 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-combined-ca-bundle\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.879947 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-combined-ca-bundle\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.882056 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-httpd-config\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.896600 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jts64\" (UniqueName: \"kubernetes.io/projected/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-kube-api-access-jts64\") pod \"barbican-keystone-listener-655677957d-l5jzm\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.906204 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgv6t\" (UniqueName: \"kubernetes.io/projected/c72eb1cc-3002-4941-878e-409ee9abeed1-kube-api-access-fgv6t\") pod \"neutron-6d65bddd44-jz54h\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.952417 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.956766 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data-custom\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.956826 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-logs\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.956870 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.956887 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.956908 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.956934 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxhtw\" (UniqueName: \"kubernetes.io/projected/045c9ef3-23a3-445f-8d6d-90233ceb023f-kube-api-access-sxhtw\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.956994 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.957012 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj9qw\" (UniqueName: \"kubernetes.io/projected/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-kube-api-access-vj9qw\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.957034 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.957078 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-combined-ca-bundle\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.957117 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-config\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:48 crc kubenswrapper[4929]: I1002 11:30:48.982660 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.023752 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.059851 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.059910 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.059949 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxhtw\" (UniqueName: \"kubernetes.io/projected/045c9ef3-23a3-445f-8d6d-90233ceb023f-kube-api-access-sxhtw\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060021 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060038 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj9qw\" (UniqueName: \"kubernetes.io/projected/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-kube-api-access-vj9qw\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060075 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060119 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-combined-ca-bundle\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060175 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-config\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060209 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data-custom\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060226 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-logs\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.060259 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.061050 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.061641 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.062211 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.068249 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-logs\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: 
\"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.068576 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.068875 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-config\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.071229 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-combined-ca-bundle\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.071988 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.081443 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data-custom\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.087021 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxhtw\" (UniqueName: \"kubernetes.io/projected/045c9ef3-23a3-445f-8d6d-90233ceb023f-kube-api-access-sxhtw\") pod \"dnsmasq-dns-848cf88cfc-rsr95\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.092563 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj9qw\" (UniqueName: \"kubernetes.io/projected/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-kube-api-access-vj9qw\") pod \"barbican-api-69dbd5cc54-74gz4\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.155356 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.254388 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.370126 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.394119 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.466532 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-nb\") pod \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.466590 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-config\") pod \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.466667 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-svc\") pod \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.466737 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-sb\") pod \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.466802 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvtnb\" (UniqueName: \"kubernetes.io/projected/303bfc40-7c6c-4c15-97ae-4efcd8047d90-kube-api-access-rvtnb\") pod \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.466821 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-swift-storage-0\") pod \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\" (UID: \"303bfc40-7c6c-4c15-97ae-4efcd8047d90\") " Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.467601 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "303bfc40-7c6c-4c15-97ae-4efcd8047d90" (UID: "303bfc40-7c6c-4c15-97ae-4efcd8047d90"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.467920 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "303bfc40-7c6c-4c15-97ae-4efcd8047d90" (UID: "303bfc40-7c6c-4c15-97ae-4efcd8047d90"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.468314 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-config" (OuterVolumeSpecName: "config") pod "303bfc40-7c6c-4c15-97ae-4efcd8047d90" (UID: "303bfc40-7c6c-4c15-97ae-4efcd8047d90"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.468416 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "303bfc40-7c6c-4c15-97ae-4efcd8047d90" (UID: "303bfc40-7c6c-4c15-97ae-4efcd8047d90"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.468638 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "303bfc40-7c6c-4c15-97ae-4efcd8047d90" (UID: "303bfc40-7c6c-4c15-97ae-4efcd8047d90"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.475170 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303bfc40-7c6c-4c15-97ae-4efcd8047d90-kube-api-access-rvtnb" (OuterVolumeSpecName: "kube-api-access-rvtnb") pod "303bfc40-7c6c-4c15-97ae-4efcd8047d90" (UID: "303bfc40-7c6c-4c15-97ae-4efcd8047d90"). InnerVolumeSpecName "kube-api-access-rvtnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.559931 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f9b7d8ff7-88gb5"] Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.568984 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.569018 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.569032 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.569041 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.569051 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvtnb\" (UniqueName: \"kubernetes.io/projected/303bfc40-7c6c-4c15-97ae-4efcd8047d90-kube-api-access-rvtnb\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.569064 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303bfc40-7c6c-4c15-97ae-4efcd8047d90-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:49 crc kubenswrapper[4929]: W1002 11:30:49.641767 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod842a33bb_8f7e_468a_96de_cf4d2b4a1d3f.slice/crio-d36ae9f5fc9ba60dce7832f3d0e4cb76f1e0f12d658a9e4badbd8655d11a8150 WatchSource:0}: Error finding container 
d36ae9f5fc9ba60dce7832f3d0e4cb76f1e0f12d658a9e4badbd8655d11a8150: Status 404 returned error can't find the container with id d36ae9f5fc9ba60dce7832f3d0e4cb76f1e0f12d658a9e4badbd8655d11a8150 Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.642572 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-655677957d-l5jzm"] Oct 02 11:30:49 crc kubenswrapper[4929]: W1002 11:30:49.789060 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc72eb1cc_3002_4941_878e_409ee9abeed1.slice/crio-dfdcdd042f69f51d7e5b6a8fc8175f623f8593f9dbb166f1591da0878719d1a2 WatchSource:0}: Error finding container dfdcdd042f69f51d7e5b6a8fc8175f623f8593f9dbb166f1591da0878719d1a2: Status 404 returned error can't find the container with id dfdcdd042f69f51d7e5b6a8fc8175f623f8593f9dbb166f1591da0878719d1a2 Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.789483 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d65bddd44-jz54h"] Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.809375 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rsr95"] Oct 02 11:30:49 crc kubenswrapper[4929]: I1002 11:30:49.918847 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69dbd5cc54-74gz4"] Oct 02 11:30:49 crc kubenswrapper[4929]: W1002 11:30:49.922001 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca2d3300_d1c4_45dc_89d7_ef5b9beab085.slice/crio-3c1eff147b3c870bfc44217a0f45364f7fce7d58bf049d812165f2eb81cad180 WatchSource:0}: Error finding container 3c1eff147b3c870bfc44217a0f45364f7fce7d58bf049d812165f2eb81cad180: Status 404 returned error can't find the container with id 3c1eff147b3c870bfc44217a0f45364f7fce7d58bf049d812165f2eb81cad180 Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.382578 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" event={"ID":"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f","Type":"ContainerStarted","Data":"d36ae9f5fc9ba60dce7832f3d0e4cb76f1e0f12d658a9e4badbd8655d11a8150"} Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.385234 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" event={"ID":"045c9ef3-23a3-445f-8d6d-90233ceb023f","Type":"ContainerStarted","Data":"ed1263d66e888416b6de0df344c59ecb4c81d0133e4c19cd6307cc92dee18214"} Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.385271 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" event={"ID":"045c9ef3-23a3-445f-8d6d-90233ceb023f","Type":"ContainerStarted","Data":"947ae1e71de670f1d0bb70c61a9fe3dd8f2d2d29259716b8ece947f379221a9d"} Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.395316 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d65bddd44-jz54h" event={"ID":"c72eb1cc-3002-4941-878e-409ee9abeed1","Type":"ContainerStarted","Data":"dfdcdd042f69f51d7e5b6a8fc8175f623f8593f9dbb166f1591da0878719d1a2"} Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.411296 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerStarted","Data":"462bf80fe859ca45531ff9b09ee7cc2efa2d91d0a837705a92a3941f84e9e3cb"} Oct 02 11:30:50 crc kubenswrapper[4929]: 
I1002 11:30:50.412382 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.416293 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" event={"ID":"55fd721a-9a86-4aff-98ee-133ebd5c4f41","Type":"ContainerStarted","Data":"75b30becb54620544c631af8adaf9f8ea7d83f419531de75d47384e21d4ffdf4"} Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.418454 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-l6vqp" Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.418779 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69dbd5cc54-74gz4" event={"ID":"ca2d3300-d1c4-45dc-89d7-ef5b9beab085","Type":"ContainerStarted","Data":"a257ef1006e7dd6159cdad857038b532cf179c6d300280c569748fb9af650c72"} Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.418809 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69dbd5cc54-74gz4" event={"ID":"ca2d3300-d1c4-45dc-89d7-ef5b9beab085","Type":"ContainerStarted","Data":"3c1eff147b3c870bfc44217a0f45364f7fce7d58bf049d812165f2eb81cad180"} Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.455836 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.265220463 podStartE2EDuration="6.446791202s" podCreationTimestamp="2025-10-02 11:30:44 +0000 UTC" firstStartedPulling="2025-10-02 11:30:45.159802213 +0000 UTC m=+1245.710168577" lastFinishedPulling="2025-10-02 11:30:49.341372952 +0000 UTC m=+1249.891739316" observedRunningTime="2025-10-02 11:30:50.429863086 +0000 UTC m=+1250.980229450" watchObservedRunningTime="2025-10-02 11:30:50.446791202 +0000 UTC m=+1250.997157566" Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.562440 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-l6vqp"] Oct 02 11:30:50 crc kubenswrapper[4929]: I1002 11:30:50.583129 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-l6vqp"] Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.428924 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69dbd5cc54-74gz4" event={"ID":"ca2d3300-d1c4-45dc-89d7-ef5b9beab085","Type":"ContainerStarted","Data":"e4177002c44e8d7a21e159caedf153718ca4345bd9c442348bd873ff71cb8ad6"} Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.429394 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.429416 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.434310 4929 generic.go:334] "Generic (PLEG): container finished" podID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerID="ed1263d66e888416b6de0df344c59ecb4c81d0133e4c19cd6307cc92dee18214" exitCode=0 Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.434381 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" event={"ID":"045c9ef3-23a3-445f-8d6d-90233ceb023f","Type":"ContainerDied","Data":"ed1263d66e888416b6de0df344c59ecb4c81d0133e4c19cd6307cc92dee18214"} Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.434403 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" event={"ID":"045c9ef3-23a3-445f-8d6d-90233ceb023f","Type":"ContainerStarted","Data":"166242403a9a523000d1d0c321aee5081ce107e0acc826bfcb01bb1992281633"} Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.434504 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.437991 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d65bddd44-jz54h" event={"ID":"c72eb1cc-3002-4941-878e-409ee9abeed1","Type":"ContainerStarted","Data":"8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd"} Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.438017 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d65bddd44-jz54h" event={"ID":"c72eb1cc-3002-4941-878e-409ee9abeed1","Type":"ContainerStarted","Data":"aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d"} Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.438031 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.476322 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69dbd5cc54-74gz4" podStartSLOduration=3.476304769 podStartE2EDuration="3.476304769s" podCreationTimestamp="2025-10-02 11:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:51.455732016 +0000 UTC m=+1252.006098380" watchObservedRunningTime="2025-10-02 11:30:51.476304769 +0000 UTC m=+1252.026671123" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.485696 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d65bddd44-jz54h" podStartSLOduration=3.485682191 podStartE2EDuration="3.485682191s" podCreationTimestamp="2025-10-02 11:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:51.473083893 +0000 UTC m=+1252.023450277" watchObservedRunningTime="2025-10-02 11:30:51.485682191 +0000 UTC m=+1252.036048555" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.503897 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" podStartSLOduration=3.503877021 podStartE2EDuration="3.503877021s" podCreationTimestamp="2025-10-02 11:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:51.498767993 +0000 UTC m=+1252.049134357" watchObservedRunningTime="2025-10-02 11:30:51.503877021 +0000 UTC m=+1252.054243405" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.852128 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f54bbfbbc-rzbv9"] Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.853906 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.857691 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.858458 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.878101 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f54bbfbbc-rzbv9"] Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.980948 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-public-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.981041 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-combined-ca-bundle\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.981343 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmsj\" (UniqueName: \"kubernetes.io/projected/62e033b9-12bd-4de4-ba18-807beaca68db-kube-api-access-5xmsj\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.981463 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-config\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.981564 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-internal-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.981654 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-ovndb-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:51 crc kubenswrapper[4929]: I1002 11:30:51.981921 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-httpd-config\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.084050 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmsj\" (UniqueName: 
\"kubernetes.io/projected/62e033b9-12bd-4de4-ba18-807beaca68db-kube-api-access-5xmsj\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.084116 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-config\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.084164 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-internal-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.084201 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-ovndb-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.084228 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-httpd-config\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.084320 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-public-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.084365 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-combined-ca-bundle\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.090428 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-internal-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.090493 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-combined-ca-bundle\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.091177 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-public-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " 
pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.091660 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-ovndb-tls-certs\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.092299 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-httpd-config\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.096305 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-config\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.103111 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmsj\" (UniqueName: \"kubernetes.io/projected/62e033b9-12bd-4de4-ba18-807beaca68db-kube-api-access-5xmsj\") pod \"neutron-f54bbfbbc-rzbv9\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.167695 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303bfc40-7c6c-4c15-97ae-4efcd8047d90" path="/var/lib/kubelet/pods/303bfc40-7c6c-4c15-97ae-4efcd8047d90/volumes" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.221114 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.464042 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" event={"ID":"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f","Type":"ContainerStarted","Data":"67d3645ec9cfed216d6036455755f8b22923aae7acb7d9366f616685db4f7af8"} Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.464393 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" event={"ID":"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f","Type":"ContainerStarted","Data":"e3c00d90ab5c8fdbfb94fad352ef76e3b0dd878ba364460bbb331b6a693a2e07"} Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.492940 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" podStartSLOduration=2.741576351 podStartE2EDuration="4.492917389s" podCreationTimestamp="2025-10-02 11:30:48 +0000 UTC" firstStartedPulling="2025-10-02 11:30:49.64574359 +0000 UTC m=+1250.196109954" lastFinishedPulling="2025-10-02 11:30:51.397084628 +0000 UTC m=+1251.947450992" observedRunningTime="2025-10-02 11:30:52.485411327 +0000 UTC m=+1253.035777691" watchObservedRunningTime="2025-10-02 11:30:52.492917389 +0000 UTC m=+1253.043283753" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.747560 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.825021 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:30:52 crc kubenswrapper[4929]: I1002 11:30:52.849431 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f54bbfbbc-rzbv9"] Oct 02 11:30:53 crc kubenswrapper[4929]: I1002 11:30:53.474248 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f54bbfbbc-rzbv9" event={"ID":"62e033b9-12bd-4de4-ba18-807beaca68db","Type":"ContainerStarted","Data":"bda45a2f099cd2429f8ccbfa6ba0badb7ccc111062372af4cf272f5542413c30"} Oct 02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.484721 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f54bbfbbc-rzbv9" event={"ID":"62e033b9-12bd-4de4-ba18-807beaca68db","Type":"ContainerStarted","Data":"4a26eb13a68fc86fca37ccadbc35bdf199a826d5b4a5034fe350778970631e25"} Oct 02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.485165 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f54bbfbbc-rzbv9" event={"ID":"62e033b9-12bd-4de4-ba18-807beaca68db","Type":"ContainerStarted","Data":"52e15741d914815b2fb093a46215236fea49e8f8564b50718e5c10df7b9ff3e8"} Oct 02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.485188 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.506262 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f54bbfbbc-rzbv9" podStartSLOduration=3.506246055 podStartE2EDuration="3.506246055s" podCreationTimestamp="2025-10-02 11:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:54.501018154 +0000 UTC m=+1255.051384518" watchObservedRunningTime="2025-10-02 11:30:54.506246055 +0000 UTC m=+1255.056612419" Oct 
02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.991732 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56d58dd68b-qlcrz"] Oct 02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.993092 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.995313 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 02 11:30:54 crc kubenswrapper[4929]: I1002 11:30:54.996232 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.052031 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56d58dd68b-qlcrz"] Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.145148 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-logs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.145214 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-combined-ca-bundle\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.145254 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.145288 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m88f6\" (UniqueName: \"kubernetes.io/projected/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-kube-api-access-m88f6\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.145350 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data-custom\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.145453 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-internal-tls-certs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.145482 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-public-tls-certs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.247656 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-internal-tls-certs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.247860 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-public-tls-certs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.247974 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-logs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.248054 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-combined-ca-bundle\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.248143 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.248225 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m88f6\" (UniqueName: \"kubernetes.io/projected/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-kube-api-access-m88f6\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.248448 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data-custom\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.250673 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-logs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.254765 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-public-tls-certs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.255564 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-combined-ca-bundle\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.259347 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-internal-tls-certs\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.275798 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data-custom\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.275917 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.280643 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m88f6\" (UniqueName: \"kubernetes.io/projected/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-kube-api-access-m88f6\") pod \"barbican-api-56d58dd68b-qlcrz\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.323484 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:55 crc kubenswrapper[4929]: I1002 11:30:55.848282 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56d58dd68b-qlcrz"] Oct 02 11:30:55 crc kubenswrapper[4929]: W1002 11:30:55.852292 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf0101ab_4fa3_4475_a685_fdd9ebb0ef68.slice/crio-fc31bdfb0f18066d6e9960af0c4a27af9f9f720f2a5b8977bc86d8c65246110e WatchSource:0}: Error finding container fc31bdfb0f18066d6e9960af0c4a27af9f9f720f2a5b8977bc86d8c65246110e: Status 404 returned error can't find the container with id fc31bdfb0f18066d6e9960af0c4a27af9f9f720f2a5b8977bc86d8c65246110e Oct 02 11:30:56 crc kubenswrapper[4929]: I1002 11:30:56.306213 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7bd786b699-2sf9r" Oct 02 11:30:56 crc kubenswrapper[4929]: I1002 11:30:56.519360 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d58dd68b-qlcrz" event={"ID":"df0101ab-4fa3-4475-a685-fdd9ebb0ef68","Type":"ContainerStarted","Data":"21cd6c9af3eab7ce82f56ddfb8a37ffb6223a10844bf34a7a2c8ec24da313794"} Oct 02 11:30:56 crc kubenswrapper[4929]: I1002 11:30:56.519412 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d58dd68b-qlcrz" event={"ID":"df0101ab-4fa3-4475-a685-fdd9ebb0ef68","Type":"ContainerStarted","Data":"fc31bdfb0f18066d6e9960af0c4a27af9f9f720f2a5b8977bc86d8c65246110e"} Oct 02 11:30:56 crc kubenswrapper[4929]: I1002 11:30:56.521538 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" event={"ID":"55fd721a-9a86-4aff-98ee-133ebd5c4f41","Type":"ContainerStarted","Data":"241b3468cb9d97e9ba6f173143b44936f01809a56d73599a592eacd87d2efe4d"} Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.449595 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.453886 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.461275 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.462002 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qj4tt" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.462316 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.494320 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.546853 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" event={"ID":"55fd721a-9a86-4aff-98ee-133ebd5c4f41","Type":"ContainerStarted","Data":"50ab6d4116dc1e4db95a4dd8529214c90a135250b4f2785bbf13989c07ed52bf"} Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.611088 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.611159 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.611218 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj9s4\" (UniqueName: \"kubernetes.io/projected/a4d58654-fb5e-4caf-a555-0a6865d5337c-kube-api-access-lj9s4\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.611288 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.712734 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.712797 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.712844 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.712901 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj9s4\" (UniqueName: \"kubernetes.io/projected/a4d58654-fb5e-4caf-a555-0a6865d5337c-kube-api-access-lj9s4\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.714736 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.720902 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.722308 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config-secret\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.742196 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj9s4\" (UniqueName: \"kubernetes.io/projected/a4d58654-fb5e-4caf-a555-0a6865d5337c-kube-api-access-lj9s4\") pod \"openstackclient\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " pod="openstack/openstackclient" Oct 02 11:30:58 crc kubenswrapper[4929]: I1002 11:30:58.789064 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.157193 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.224390 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mh9sd"] Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.224743 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" podUID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerName="dnsmasq-dns" containerID="cri-o://9613e61314e8d8663e2f8f6c661a9cc50f93cc5a9ce125b53a1bb990f3441f17" gracePeriod=10 Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.263679 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:30:59 crc kubenswrapper[4929]: W1002 11:30:59.429371 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4d58654_fb5e_4caf_a555_0a6865d5337c.slice/crio-d1f5acba0fd15d9c5293b01cde8b30928a7cd311181b0da0f7d0f6c3cd162123 WatchSource:0}: Error finding container d1f5acba0fd15d9c5293b01cde8b30928a7cd311181b0da0f7d0f6c3cd162123: Status 404 returned error can't find the container with id d1f5acba0fd15d9c5293b01cde8b30928a7cd311181b0da0f7d0f6c3cd162123 Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.585837 4929 generic.go:334] "Generic (PLEG): container finished" podID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerID="9613e61314e8d8663e2f8f6c661a9cc50f93cc5a9ce125b53a1bb990f3441f17" exitCode=0 Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.585982 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" event={"ID":"471329a2-ca2f-4ba2-b750-32f88de79c8f","Type":"ContainerDied","Data":"9613e61314e8d8663e2f8f6c661a9cc50f93cc5a9ce125b53a1bb990f3441f17"} Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.606322 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a4d58654-fb5e-4caf-a555-0a6865d5337c","Type":"ContainerStarted","Data":"d1f5acba0fd15d9c5293b01cde8b30928a7cd311181b0da0f7d0f6c3cd162123"} Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.648927 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d58dd68b-qlcrz" event={"ID":"df0101ab-4fa3-4475-a685-fdd9ebb0ef68","Type":"ContainerStarted","Data":"e53b3ad5b5ce14f2519bfc0e1a58672bd568af0b97109869b7485930f064cca6"} Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.649017 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.649051 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.685098 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" podStartSLOduration=6.014580536 podStartE2EDuration="11.685079982s" podCreationTimestamp="2025-10-02 11:30:48 +0000 UTC" firstStartedPulling="2025-10-02 11:30:49.549619474 +0000 UTC m=+1250.099985838" lastFinishedPulling="2025-10-02 11:30:55.22011892 +0000 UTC m=+1255.770485284" observedRunningTime="2025-10-02 11:30:59.682390279 +0000 UTC m=+1260.232756643" 
watchObservedRunningTime="2025-10-02 11:30:59.685079982 +0000 UTC m=+1260.235446356" Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.716087 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56d58dd68b-qlcrz" podStartSLOduration=5.716069726 podStartE2EDuration="5.716069726s" podCreationTimestamp="2025-10-02 11:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:30:59.70508538 +0000 UTC m=+1260.255451754" watchObservedRunningTime="2025-10-02 11:30:59.716069726 +0000 UTC m=+1260.266436090" Oct 02 11:30:59 crc kubenswrapper[4929]: I1002 11:30:59.957414 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.045205 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-sb\") pod \"471329a2-ca2f-4ba2-b750-32f88de79c8f\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.045259 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-svc\") pod \"471329a2-ca2f-4ba2-b750-32f88de79c8f\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.045392 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mpgn\" (UniqueName: \"kubernetes.io/projected/471329a2-ca2f-4ba2-b750-32f88de79c8f-kube-api-access-4mpgn\") pod \"471329a2-ca2f-4ba2-b750-32f88de79c8f\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.045447 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-config\") pod \"471329a2-ca2f-4ba2-b750-32f88de79c8f\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.045496 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-swift-storage-0\") pod \"471329a2-ca2f-4ba2-b750-32f88de79c8f\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.045540 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-nb\") pod \"471329a2-ca2f-4ba2-b750-32f88de79c8f\" (UID: \"471329a2-ca2f-4ba2-b750-32f88de79c8f\") " Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.053807 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/471329a2-ca2f-4ba2-b750-32f88de79c8f-kube-api-access-4mpgn" (OuterVolumeSpecName: "kube-api-access-4mpgn") pod "471329a2-ca2f-4ba2-b750-32f88de79c8f" (UID: "471329a2-ca2f-4ba2-b750-32f88de79c8f"). InnerVolumeSpecName "kube-api-access-4mpgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.121610 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-config" (OuterVolumeSpecName: "config") pod "471329a2-ca2f-4ba2-b750-32f88de79c8f" (UID: "471329a2-ca2f-4ba2-b750-32f88de79c8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.136122 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "471329a2-ca2f-4ba2-b750-32f88de79c8f" (UID: "471329a2-ca2f-4ba2-b750-32f88de79c8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.149795 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.149827 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mpgn\" (UniqueName: \"kubernetes.io/projected/471329a2-ca2f-4ba2-b750-32f88de79c8f-kube-api-access-4mpgn\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.149838 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.162564 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "471329a2-ca2f-4ba2-b750-32f88de79c8f" (UID: "471329a2-ca2f-4ba2-b750-32f88de79c8f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.190589 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "471329a2-ca2f-4ba2-b750-32f88de79c8f" (UID: "471329a2-ca2f-4ba2-b750-32f88de79c8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.205858 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "471329a2-ca2f-4ba2-b750-32f88de79c8f" (UID: "471329a2-ca2f-4ba2-b750-32f88de79c8f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.253500 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.253548 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.253562 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471329a2-ca2f-4ba2-b750-32f88de79c8f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.660772 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" event={"ID":"471329a2-ca2f-4ba2-b750-32f88de79c8f","Type":"ContainerDied","Data":"1c844c4fb650f96541a30efba52b4f503a21085a0c0b23e8e559591d823b5b02"} Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.661072 4929 scope.go:117] "RemoveContainer" containerID="9613e61314e8d8663e2f8f6c661a9cc50f93cc5a9ce125b53a1bb990f3441f17" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.661206 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-mh9sd" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.672748 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zr8dn" event={"ID":"69bfcad7-d630-4361-b28d-f072ac3f84a0","Type":"ContainerStarted","Data":"7dc3a73ce0b7c90f23c4ff562b6e91fe7ea7b7babea5572654dcc3b8d39ab6c5"} Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.704653 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-zr8dn" podStartSLOduration=2.899025922 podStartE2EDuration="41.704635721s" podCreationTimestamp="2025-10-02 11:30:19 +0000 UTC" firstStartedPulling="2025-10-02 11:30:20.64855295 +0000 UTC m=+1221.198919314" lastFinishedPulling="2025-10-02 11:30:59.454162749 +0000 UTC m=+1260.004529113" observedRunningTime="2025-10-02 11:31:00.691230881 +0000 UTC m=+1261.241597245" watchObservedRunningTime="2025-10-02 11:31:00.704635721 +0000 UTC m=+1261.255002085" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.712605 4929 scope.go:117] "RemoveContainer" containerID="6f688d9ddf62796d832ca0848dbf1c98b55fce41f99c8e828191b4d34de8fa8d" Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.716875 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mh9sd"] Oct 02 11:31:00 crc kubenswrapper[4929]: I1002 11:31:00.726113 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-mh9sd"] Oct 02 11:31:01 crc kubenswrapper[4929]: I1002 11:31:01.310889 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:31:01 crc kubenswrapper[4929]: I1002 11:31:01.338981 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:31:02 crc kubenswrapper[4929]: I1002 11:31:02.196582 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="471329a2-ca2f-4ba2-b750-32f88de79c8f" 
path="/var/lib/kubelet/pods/471329a2-ca2f-4ba2-b750-32f88de79c8f/volumes" Oct 02 11:31:04 crc kubenswrapper[4929]: I1002 11:31:04.700610 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:31:06 crc kubenswrapper[4929]: I1002 11:31:06.343946 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:31:06 crc kubenswrapper[4929]: I1002 11:31:06.408421 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69dbd5cc54-74gz4"] Oct 02 11:31:06 crc kubenswrapper[4929]: I1002 11:31:06.408706 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69dbd5cc54-74gz4" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api-log" containerID="cri-o://a257ef1006e7dd6159cdad857038b532cf179c6d300280c569748fb9af650c72" gracePeriod=30 Oct 02 11:31:06 crc kubenswrapper[4929]: I1002 11:31:06.409048 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69dbd5cc54-74gz4" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api" containerID="cri-o://e4177002c44e8d7a21e159caedf153718ca4345bd9c442348bd873ff71cb8ad6" gracePeriod=30 Oct 02 11:31:06 crc kubenswrapper[4929]: I1002 11:31:06.751848 4929 generic.go:334] "Generic (PLEG): container finished" podID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerID="a257ef1006e7dd6159cdad857038b532cf179c6d300280c569748fb9af650c72" exitCode=143 Oct 02 11:31:06 crc kubenswrapper[4929]: I1002 11:31:06.751893 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69dbd5cc54-74gz4" event={"ID":"ca2d3300-d1c4-45dc-89d7-ef5b9beab085","Type":"ContainerDied","Data":"a257ef1006e7dd6159cdad857038b532cf179c6d300280c569748fb9af650c72"} Oct 02 11:31:09 crc kubenswrapper[4929]: I1002 11:31:09.566096 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69dbd5cc54-74gz4" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:44772->10.217.0.158:9311: read: connection reset by peer" Oct 02 11:31:09 crc kubenswrapper[4929]: I1002 11:31:09.566130 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69dbd5cc54-74gz4" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:44780->10.217.0.158:9311: read: connection reset by peer" Oct 02 11:31:09 crc kubenswrapper[4929]: I1002 11:31:09.782763 4929 generic.go:334] "Generic (PLEG): container finished" podID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerID="e4177002c44e8d7a21e159caedf153718ca4345bd9c442348bd873ff71cb8ad6" exitCode=0 Oct 02 11:31:09 crc kubenswrapper[4929]: I1002 11:31:09.782809 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69dbd5cc54-74gz4" event={"ID":"ca2d3300-d1c4-45dc-89d7-ef5b9beab085","Type":"ContainerDied","Data":"e4177002c44e8d7a21e159caedf153718ca4345bd9c442348bd873ff71cb8ad6"} Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.027467 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-69566f664c-jps5x"] Oct 02 11:31:11 crc kubenswrapper[4929]: E1002 11:31:11.027900 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerName="init" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.027914 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerName="init" Oct 02 11:31:11 crc kubenswrapper[4929]: E1002 11:31:11.027939 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerName="dnsmasq-dns" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.027945 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerName="dnsmasq-dns" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.028137 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="471329a2-ca2f-4ba2-b750-32f88de79c8f" containerName="dnsmasq-dns" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.029076 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.037423 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69566f664c-jps5x"] Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.037442 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.037490 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.037876 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103296 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-etc-swift\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103371 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-combined-ca-bundle\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103398 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-log-httpd\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103423 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-config-data\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103443 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46q5k\" (UniqueName: 
\"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-kube-api-access-46q5k\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103463 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-public-tls-certs\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103491 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-run-httpd\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.103552 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-internal-tls-certs\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205453 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-internal-tls-certs\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205603 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-etc-swift\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205643 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-combined-ca-bundle\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205677 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-log-httpd\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205709 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-config-data\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205743 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46q5k\" (UniqueName: 
\"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-kube-api-access-46q5k\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205766 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-public-tls-certs\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.205794 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-run-httpd\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.206355 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-run-httpd\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.206675 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-log-httpd\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.211340 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-public-tls-certs\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.211840 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-etc-swift\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.217734 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-internal-tls-certs\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.217753 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-combined-ca-bundle\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.217878 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-config-data\") pod \"swift-proxy-69566f664c-jps5x\" (UID: 
\"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.224491 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46q5k\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-kube-api-access-46q5k\") pod \"swift-proxy-69566f664c-jps5x\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") " pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.348714 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.599029 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.614059 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-combined-ca-bundle\") pod \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.614198 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data-custom\") pod \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.614238 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj9qw\" (UniqueName: \"kubernetes.io/projected/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-kube-api-access-vj9qw\") pod \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.614265 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data\") pod \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.614313 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-logs\") pod \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\" (UID: \"ca2d3300-d1c4-45dc-89d7-ef5b9beab085\") " Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.615233 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-logs" (OuterVolumeSpecName: "logs") pod "ca2d3300-d1c4-45dc-89d7-ef5b9beab085" (UID: "ca2d3300-d1c4-45dc-89d7-ef5b9beab085"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.621095 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca2d3300-d1c4-45dc-89d7-ef5b9beab085" (UID: "ca2d3300-d1c4-45dc-89d7-ef5b9beab085"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.629614 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-kube-api-access-vj9qw" (OuterVolumeSpecName: "kube-api-access-vj9qw") pod "ca2d3300-d1c4-45dc-89d7-ef5b9beab085" (UID: "ca2d3300-d1c4-45dc-89d7-ef5b9beab085"). InnerVolumeSpecName "kube-api-access-vj9qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.656997 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca2d3300-d1c4-45dc-89d7-ef5b9beab085" (UID: "ca2d3300-d1c4-45dc-89d7-ef5b9beab085"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.666512 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data" (OuterVolumeSpecName: "config-data") pod "ca2d3300-d1c4-45dc-89d7-ef5b9beab085" (UID: "ca2d3300-d1c4-45dc-89d7-ef5b9beab085"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.715900 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.715933 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.715943 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.715954 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.715979 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj9qw\" (UniqueName: \"kubernetes.io/projected/ca2d3300-d1c4-45dc-89d7-ef5b9beab085-kube-api-access-vj9qw\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.803371 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69dbd5cc54-74gz4" event={"ID":"ca2d3300-d1c4-45dc-89d7-ef5b9beab085","Type":"ContainerDied","Data":"3c1eff147b3c870bfc44217a0f45364f7fce7d58bf049d812165f2eb81cad180"} Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.803432 4929 scope.go:117] "RemoveContainer" containerID="e4177002c44e8d7a21e159caedf153718ca4345bd9c442348bd873ff71cb8ad6" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.803458 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69dbd5cc54-74gz4" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.828689 4929 scope.go:117] "RemoveContainer" containerID="a257ef1006e7dd6159cdad857038b532cf179c6d300280c569748fb9af650c72" Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.848205 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69dbd5cc54-74gz4"] Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.856776 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-69dbd5cc54-74gz4"] Oct 02 11:31:11 crc kubenswrapper[4929]: I1002 11:31:11.948170 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69566f664c-jps5x"] Oct 02 11:31:11 crc kubenswrapper[4929]: W1002 11:31:11.954816 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23c56c4a_763f_4ce6_8b1f_d862662b16ec.slice/crio-c62d0fecc8e0afb7f30bafad49e36c7e7ef2d820bca5f6117bf9998ba82d0c1c WatchSource:0}: Error finding container c62d0fecc8e0afb7f30bafad49e36c7e7ef2d820bca5f6117bf9998ba82d0c1c: Status 404 returned error can't find the container with id c62d0fecc8e0afb7f30bafad49e36c7e7ef2d820bca5f6117bf9998ba82d0c1c Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.189818 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" path="/var/lib/kubelet/pods/ca2d3300-d1c4-45dc-89d7-ef5b9beab085/volumes" Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.813936 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69566f664c-jps5x" event={"ID":"23c56c4a-763f-4ce6-8b1f-d862662b16ec","Type":"ContainerStarted","Data":"027e91ba08dcef685cce7a361a75e6c896ae57660ea313b7e71bbff0e07f1279"} Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.814249 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69566f664c-jps5x" event={"ID":"23c56c4a-763f-4ce6-8b1f-d862662b16ec","Type":"ContainerStarted","Data":"a9694eb8911f9ff8551424e463b8bef808eae32090d923b999eba6998b0901c3"} Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.814263 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69566f664c-jps5x" event={"ID":"23c56c4a-763f-4ce6-8b1f-d862662b16ec","Type":"ContainerStarted","Data":"c62d0fecc8e0afb7f30bafad49e36c7e7ef2d820bca5f6117bf9998ba82d0c1c"} Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.814293 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.814310 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.815678 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a4d58654-fb5e-4caf-a555-0a6865d5337c","Type":"ContainerStarted","Data":"f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6"} Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.846368 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-69566f664c-jps5x" podStartSLOduration=1.8463481640000001 podStartE2EDuration="1.846348164s" podCreationTimestamp="2025-10-02 11:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-02 11:31:12.839314735 +0000 UTC m=+1273.389681109" watchObservedRunningTime="2025-10-02 11:31:12.846348164 +0000 UTC m=+1273.396714518" Oct 02 11:31:12 crc kubenswrapper[4929]: I1002 11:31:12.857910 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.5058639940000003 podStartE2EDuration="14.857886894s" podCreationTimestamp="2025-10-02 11:30:58 +0000 UTC" firstStartedPulling="2025-10-02 11:30:59.449231607 +0000 UTC m=+1259.999597971" lastFinishedPulling="2025-10-02 11:31:11.801254507 +0000 UTC m=+1272.351620871" observedRunningTime="2025-10-02 11:31:12.852537391 +0000 UTC m=+1273.402903755" watchObservedRunningTime="2025-10-02 11:31:12.857886894 +0000 UTC m=+1273.408253258" Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.730224 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.736580 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.736691 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.736766 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.737763 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d06bfb52896e631ee026cc068e1500959957fd07486c92bce6fd839653f6a217"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.737875 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://d06bfb52896e631ee026cc068e1500959957fd07486c92bce6fd839653f6a217" gracePeriod=600 Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.926516 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.926777 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-central-agent" containerID="cri-o://74039636db2c6dff50014462c615eaa09b841fcc8d33ff434d4ea85ffda62f92" gracePeriod=30 Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.926891 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="sg-core" 
containerID="cri-o://99dbf9e33c9c9434c7a7136f2e279fea82b921758a2405426dbc1a3112f2ec84" gracePeriod=30 Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.927035 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="proxy-httpd" containerID="cri-o://462bf80fe859ca45531ff9b09ee7cc2efa2d91d0a837705a92a3941f84e9e3cb" gracePeriod=30 Oct 02 11:31:14 crc kubenswrapper[4929]: I1002 11:31:14.927053 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-notification-agent" containerID="cri-o://ba736854a52194f11e467bacf9703ab3db3238818aad51ed6a333fa0cc04d412" gracePeriod=30 Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.849131 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="d06bfb52896e631ee026cc068e1500959957fd07486c92bce6fd839653f6a217" exitCode=0 Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.849177 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"d06bfb52896e631ee026cc068e1500959957fd07486c92bce6fd839653f6a217"} Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.849526 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"85f08424ea0549c33e8adce5bf52a0ee3804dea4bc1b5c410a9b0fdc77644661"} Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.849548 4929 scope.go:117] "RemoveContainer" containerID="4f30c8067764cbf742a0d9d0a1f047810aa84e3e7853a564b95946cb32658616" Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.853258 4929 generic.go:334] "Generic (PLEG): container finished" podID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerID="462bf80fe859ca45531ff9b09ee7cc2efa2d91d0a837705a92a3941f84e9e3cb" exitCode=0 Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.853291 4929 generic.go:334] "Generic (PLEG): container finished" podID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerID="99dbf9e33c9c9434c7a7136f2e279fea82b921758a2405426dbc1a3112f2ec84" exitCode=2 Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.853299 4929 generic.go:334] "Generic (PLEG): container finished" podID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerID="74039636db2c6dff50014462c615eaa09b841fcc8d33ff434d4ea85ffda62f92" exitCode=0 Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.853321 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerDied","Data":"462bf80fe859ca45531ff9b09ee7cc2efa2d91d0a837705a92a3941f84e9e3cb"} Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.853345 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerDied","Data":"99dbf9e33c9c9434c7a7136f2e279fea82b921758a2405426dbc1a3112f2ec84"} Oct 02 11:31:15 crc kubenswrapper[4929]: I1002 11:31:15.853355 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerDied","Data":"74039636db2c6dff50014462c615eaa09b841fcc8d33ff434d4ea85ffda62f92"} Oct 02 11:31:16 crc kubenswrapper[4929]: I1002 11:31:16.872922 4929 generic.go:334] "Generic (PLEG): container finished" podID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerID="ba736854a52194f11e467bacf9703ab3db3238818aad51ed6a333fa0cc04d412" exitCode=0 Oct 02 11:31:16 crc kubenswrapper[4929]: I1002 11:31:16.873394 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerDied","Data":"ba736854a52194f11e467bacf9703ab3db3238818aad51ed6a333fa0cc04d412"} Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.103324 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.137465 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-combined-ca-bundle\") pod \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.137607 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-sg-core-conf-yaml\") pod \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.137673 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-config-data\") pod \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.137759 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-scripts\") pod \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.137801 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-log-httpd\") pod \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.137850 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4v5v\" (UniqueName: \"kubernetes.io/projected/1eb9e992-28b7-4e84-acd9-6022cbaced5e-kube-api-access-g4v5v\") pod \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.137892 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-run-httpd\") pod \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\" (UID: \"1eb9e992-28b7-4e84-acd9-6022cbaced5e\") " Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.139273 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "1eb9e992-28b7-4e84-acd9-6022cbaced5e" (UID: "1eb9e992-28b7-4e84-acd9-6022cbaced5e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.141289 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1eb9e992-28b7-4e84-acd9-6022cbaced5e" (UID: "1eb9e992-28b7-4e84-acd9-6022cbaced5e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.145058 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb9e992-28b7-4e84-acd9-6022cbaced5e-kube-api-access-g4v5v" (OuterVolumeSpecName: "kube-api-access-g4v5v") pod "1eb9e992-28b7-4e84-acd9-6022cbaced5e" (UID: "1eb9e992-28b7-4e84-acd9-6022cbaced5e"). InnerVolumeSpecName "kube-api-access-g4v5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.150681 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-scripts" (OuterVolumeSpecName: "scripts") pod "1eb9e992-28b7-4e84-acd9-6022cbaced5e" (UID: "1eb9e992-28b7-4e84-acd9-6022cbaced5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.177468 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1eb9e992-28b7-4e84-acd9-6022cbaced5e" (UID: "1eb9e992-28b7-4e84-acd9-6022cbaced5e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.212787 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eb9e992-28b7-4e84-acd9-6022cbaced5e" (UID: "1eb9e992-28b7-4e84-acd9-6022cbaced5e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.240046 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.240072 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.240082 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.240091 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.240100 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4v5v\" (UniqueName: \"kubernetes.io/projected/1eb9e992-28b7-4e84-acd9-6022cbaced5e-kube-api-access-g4v5v\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.240110 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1eb9e992-28b7-4e84-acd9-6022cbaced5e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.241467 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-config-data" (OuterVolumeSpecName: "config-data") pod "1eb9e992-28b7-4e84-acd9-6022cbaced5e" (UID: "1eb9e992-28b7-4e84-acd9-6022cbaced5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.341801 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eb9e992-28b7-4e84-acd9-6022cbaced5e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.769589 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.895851 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="469da009-8740-4581-90b5-1e99b80a7f81" containerName="kube-state-metrics" containerID="cri-o://c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad" gracePeriod=30 Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.896205 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.904286 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1eb9e992-28b7-4e84-acd9-6022cbaced5e","Type":"ContainerDied","Data":"b762613924f74d92b50fecfef1e6b05937644de4a3d6381adfb5b81254eed4c6"} Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.904355 4929 scope.go:117] "RemoveContainer" containerID="462bf80fe859ca45531ff9b09ee7cc2efa2d91d0a837705a92a3941f84e9e3cb" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.947150 4929 scope.go:117] "RemoveContainer" containerID="99dbf9e33c9c9434c7a7136f2e279fea82b921758a2405426dbc1a3112f2ec84" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.948339 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.965825 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.973442 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:17 crc kubenswrapper[4929]: E1002 11:31:17.973911 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-central-agent" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.973937 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-central-agent" Oct 02 11:31:17 crc kubenswrapper[4929]: E1002 11:31:17.973975 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.973986 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api" Oct 02 11:31:17 crc kubenswrapper[4929]: E1002 11:31:17.974019 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="sg-core" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974032 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="sg-core" Oct 02 11:31:17 crc kubenswrapper[4929]: E1002 11:31:17.974048 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api-log" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974060 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api-log" Oct 02 11:31:17 crc kubenswrapper[4929]: E1002 11:31:17.974083 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="proxy-httpd" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974094 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="proxy-httpd" Oct 02 11:31:17 crc kubenswrapper[4929]: E1002 11:31:17.974112 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-notification-agent" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974146 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-notification-agent" Oct 
02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974414 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974433 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2d3300-d1c4-45dc-89d7-ef5b9beab085" containerName="barbican-api-log" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974447 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-notification-agent" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974462 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="sg-core" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974471 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="ceilometer-central-agent" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.974484 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" containerName="proxy-httpd" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.976623 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.979838 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.980006 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.980750 4929 scope.go:117] "RemoveContainer" containerID="ba736854a52194f11e467bacf9703ab3db3238818aad51ed6a333fa0cc04d412" Oct 02 11:31:17 crc kubenswrapper[4929]: I1002 11:31:17.981925 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.056747 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-log-httpd\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.056808 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.056848 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-run-httpd\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.056985 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-config-data\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc 
kubenswrapper[4929]: I1002 11:31:18.057037 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-scripts\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.057061 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.057114 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwd96\" (UniqueName: \"kubernetes.io/projected/0696b9dd-264f-4238-bfb7-0268dcc333e5-kube-api-access-bwd96\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.083024 4929 scope.go:117] "RemoveContainer" containerID="74039636db2c6dff50014462c615eaa09b841fcc8d33ff434d4ea85ffda62f92" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.158336 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-config-data\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.158415 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-scripts\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.158438 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.158484 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwd96\" (UniqueName: \"kubernetes.io/projected/0696b9dd-264f-4238-bfb7-0268dcc333e5-kube-api-access-bwd96\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.158517 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-log-httpd\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.158544 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.158576 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-run-httpd\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.159055 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-log-httpd\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.159163 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-run-httpd\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.171656 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eb9e992-28b7-4e84-acd9-6022cbaced5e" path="/var/lib/kubelet/pods/1eb9e992-28b7-4e84-acd9-6022cbaced5e/volumes" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.177088 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.177153 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwd96\" (UniqueName: \"kubernetes.io/projected/0696b9dd-264f-4238-bfb7-0268dcc333e5-kube-api-access-bwd96\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.177307 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.180340 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-scripts\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.181526 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-config-data\") pod \"ceilometer-0\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.383128 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.444914 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.461894 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrnf6\" (UniqueName: \"kubernetes.io/projected/469da009-8740-4581-90b5-1e99b80a7f81-kube-api-access-qrnf6\") pod \"469da009-8740-4581-90b5-1e99b80a7f81\" (UID: \"469da009-8740-4581-90b5-1e99b80a7f81\") " Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.468272 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469da009-8740-4581-90b5-1e99b80a7f81-kube-api-access-qrnf6" (OuterVolumeSpecName: "kube-api-access-qrnf6") pod "469da009-8740-4581-90b5-1e99b80a7f81" (UID: "469da009-8740-4581-90b5-1e99b80a7f81"). InnerVolumeSpecName "kube-api-access-qrnf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.564353 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrnf6\" (UniqueName: \"kubernetes.io/projected/469da009-8740-4581-90b5-1e99b80a7f81-kube-api-access-qrnf6\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.818205 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:18 crc kubenswrapper[4929]: W1002 11:31:18.826930 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0696b9dd_264f_4238_bfb7_0268dcc333e5.slice/crio-f3f1355b3480de3134c5546fbba254b12bd6f8442f43d83d991fda9e18c01eac WatchSource:0}: Error finding container f3f1355b3480de3134c5546fbba254b12bd6f8442f43d83d991fda9e18c01eac: Status 404 returned error can't find the container with id f3f1355b3480de3134c5546fbba254b12bd6f8442f43d83d991fda9e18c01eac Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.906471 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerStarted","Data":"f3f1355b3480de3134c5546fbba254b12bd6f8442f43d83d991fda9e18c01eac"} Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.908015 4929 generic.go:334] "Generic (PLEG): container finished" podID="469da009-8740-4581-90b5-1e99b80a7f81" containerID="c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad" exitCode=2 Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.908043 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469da009-8740-4581-90b5-1e99b80a7f81","Type":"ContainerDied","Data":"c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad"} Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.908066 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469da009-8740-4581-90b5-1e99b80a7f81","Type":"ContainerDied","Data":"33198bda049795a7c7dd988fabccec3edfb18ccb9fc422050c86f615c36fb3f9"} Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.908082 4929 scope.go:117] "RemoveContainer" containerID="c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.908109 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.928787 4929 scope.go:117] "RemoveContainer" containerID="c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad" Oct 02 11:31:18 crc kubenswrapper[4929]: E1002 11:31:18.929301 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad\": container with ID starting with c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad not found: ID does not exist" containerID="c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.929340 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad"} err="failed to get container status \"c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad\": rpc error: code = NotFound desc = could not find container \"c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad\": container with ID starting with c2ba4ebe17c988d4e4eab064d520a14deef6346c1215c83b2499164ed0c062ad not found: ID does not exist" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.951898 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.962987 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.986309 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:31:18 crc kubenswrapper[4929]: E1002 11:31:18.987118 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469da009-8740-4581-90b5-1e99b80a7f81" containerName="kube-state-metrics" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.987145 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="469da009-8740-4581-90b5-1e99b80a7f81" containerName="kube-state-metrics" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.987639 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="469da009-8740-4581-90b5-1e99b80a7f81" containerName="kube-state-metrics" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.988845 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.996527 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 02 11:31:18 crc kubenswrapper[4929]: I1002 11:31:18.996624 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.001610 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.032480 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.087900 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.088195 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr5sm\" (UniqueName: \"kubernetes.io/projected/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-api-access-jr5sm\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.088239 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.088281 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.190471 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.190565 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr5sm\" (UniqueName: \"kubernetes.io/projected/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-api-access-jr5sm\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.190603 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 
11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.190641 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.197850 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.202805 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.204893 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.210022 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr5sm\" (UniqueName: \"kubernetes.io/projected/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-api-access-jr5sm\") pod \"kube-state-metrics-0\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.310636 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.730832 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:31:19 crc kubenswrapper[4929]: W1002 11:31:19.738376 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51f77cee_69d5_4e5c_8707_a5be1914e351.slice/crio-35d4f0c7ff12dfe5718821bd45e1f568401bdce14c220b92a3601e52685f7661 WatchSource:0}: Error finding container 35d4f0c7ff12dfe5718821bd45e1f568401bdce14c220b92a3601e52685f7661: Status 404 returned error can't find the container with id 35d4f0c7ff12dfe5718821bd45e1f568401bdce14c220b92a3601e52685f7661 Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.920857 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerStarted","Data":"a685f49e8ccebbdb3014bb3b0acc97afba3ace252dbadee82a874614d63e85f0"} Oct 02 11:31:19 crc kubenswrapper[4929]: I1002 11:31:19.923332 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51f77cee-69d5-4e5c-8707-a5be1914e351","Type":"ContainerStarted","Data":"35d4f0c7ff12dfe5718821bd45e1f568401bdce14c220b92a3601e52685f7661"} Oct 02 11:31:20 crc kubenswrapper[4929]: I1002 11:31:20.169005 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469da009-8740-4581-90b5-1e99b80a7f81" path="/var/lib/kubelet/pods/469da009-8740-4581-90b5-1e99b80a7f81/volumes" Oct 02 11:31:20 crc kubenswrapper[4929]: I1002 11:31:20.740679 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:21 crc kubenswrapper[4929]: I1002 11:31:21.353545 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:21 crc kubenswrapper[4929]: I1002 11:31:21.359699 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:31:21 crc kubenswrapper[4929]: I1002 11:31:21.946277 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51f77cee-69d5-4e5c-8707-a5be1914e351","Type":"ContainerStarted","Data":"2cae7ce5a315b3a1de86b3016d5d89f4784fd9aa5ec045ee66d7c930f75073c6"} Oct 02 11:31:21 crc kubenswrapper[4929]: I1002 11:31:21.946572 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 11:31:21 crc kubenswrapper[4929]: I1002 11:31:21.949439 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerStarted","Data":"7cef31376c1a988cee5e9685c5addae332685db009f93594773be98bf8c35a42"} Oct 02 11:31:21 crc kubenswrapper[4929]: I1002 11:31:21.949468 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerStarted","Data":"279a9d22ea46e070e9328f1f77e447d82ff7f9fa5db938dca85bb0f5307bcefd"} Oct 02 11:31:21 crc kubenswrapper[4929]: I1002 11:31:21.966568 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.021805675 podStartE2EDuration="3.966549477s" podCreationTimestamp="2025-10-02 11:31:18 +0000 UTC" firstStartedPulling="2025-10-02 11:31:19.740178561 +0000 UTC 
m=+1280.290544925" lastFinishedPulling="2025-10-02 11:31:20.684922363 +0000 UTC m=+1281.235288727" observedRunningTime="2025-10-02 11:31:21.964682683 +0000 UTC m=+1282.515049057" watchObservedRunningTime="2025-10-02 11:31:21.966549477 +0000 UTC m=+1282.516915841" Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.238943 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.314045 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6d65bddd44-jz54h"] Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.314286 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6d65bddd44-jz54h" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-api" containerID="cri-o://8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd" gracePeriod=30 Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.314448 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6d65bddd44-jz54h" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-httpd" containerID="cri-o://aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d" gracePeriod=30 Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.967225 4929 generic.go:334] "Generic (PLEG): container finished" podID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerID="aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d" exitCode=0 Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.967573 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d65bddd44-jz54h" event={"ID":"c72eb1cc-3002-4941-878e-409ee9abeed1","Type":"ContainerDied","Data":"aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d"} Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.969568 4929 generic.go:334] "Generic (PLEG): container finished" podID="69bfcad7-d630-4361-b28d-f072ac3f84a0" containerID="7dc3a73ce0b7c90f23c4ff562b6e91fe7ea7b7babea5572654dcc3b8d39ab6c5" exitCode=0 Oct 02 11:31:22 crc kubenswrapper[4929]: I1002 11:31:22.969655 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zr8dn" event={"ID":"69bfcad7-d630-4361-b28d-f072ac3f84a0","Type":"ContainerDied","Data":"7dc3a73ce0b7c90f23c4ff562b6e91fe7ea7b7babea5572654dcc3b8d39ab6c5"} Oct 02 11:31:23 crc kubenswrapper[4929]: I1002 11:31:23.979228 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerStarted","Data":"3387b0f610a0771b5e4b0afc03a43a2a8b253379e8cf5aca9aeff9b78fc5774a"} Oct 02 11:31:23 crc kubenswrapper[4929]: I1002 11:31:23.979644 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-central-agent" containerID="cri-o://a685f49e8ccebbdb3014bb3b0acc97afba3ace252dbadee82a874614d63e85f0" gracePeriod=30 Oct 02 11:31:23 crc kubenswrapper[4929]: I1002 11:31:23.979714 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="sg-core" containerID="cri-o://7cef31376c1a988cee5e9685c5addae332685db009f93594773be98bf8c35a42" gracePeriod=30 Oct 02 11:31:23 crc kubenswrapper[4929]: I1002 11:31:23.979738 4929 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="proxy-httpd" containerID="cri-o://3387b0f610a0771b5e4b0afc03a43a2a8b253379e8cf5aca9aeff9b78fc5774a" gracePeriod=30 Oct 02 11:31:23 crc kubenswrapper[4929]: I1002 11:31:23.979788 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-notification-agent" containerID="cri-o://279a9d22ea46e070e9328f1f77e447d82ff7f9fa5db938dca85bb0f5307bcefd" gracePeriod=30 Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.018385 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.991361796 podStartE2EDuration="7.018364004s" podCreationTimestamp="2025-10-02 11:31:17 +0000 UTC" firstStartedPulling="2025-10-02 11:31:18.829738297 +0000 UTC m=+1279.380104661" lastFinishedPulling="2025-10-02 11:31:22.856740505 +0000 UTC m=+1283.407106869" observedRunningTime="2025-10-02 11:31:24.007507861 +0000 UTC m=+1284.557874235" watchObservedRunningTime="2025-10-02 11:31:24.018364004 +0000 UTC m=+1284.568730368" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.350287 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.387478 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69bfcad7-d630-4361-b28d-f072ac3f84a0-etc-machine-id\") pod \"69bfcad7-d630-4361-b28d-f072ac3f84a0\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.387629 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-combined-ca-bundle\") pod \"69bfcad7-d630-4361-b28d-f072ac3f84a0\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.387738 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s8g5\" (UniqueName: \"kubernetes.io/projected/69bfcad7-d630-4361-b28d-f072ac3f84a0-kube-api-access-4s8g5\") pod \"69bfcad7-d630-4361-b28d-f072ac3f84a0\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.387799 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-config-data\") pod \"69bfcad7-d630-4361-b28d-f072ac3f84a0\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.387844 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-scripts\") pod \"69bfcad7-d630-4361-b28d-f072ac3f84a0\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.387911 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-db-sync-config-data\") pod \"69bfcad7-d630-4361-b28d-f072ac3f84a0\" (UID: \"69bfcad7-d630-4361-b28d-f072ac3f84a0\") " Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 
11:31:24.387639 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69bfcad7-d630-4361-b28d-f072ac3f84a0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "69bfcad7-d630-4361-b28d-f072ac3f84a0" (UID: "69bfcad7-d630-4361-b28d-f072ac3f84a0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.393691 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "69bfcad7-d630-4361-b28d-f072ac3f84a0" (UID: "69bfcad7-d630-4361-b28d-f072ac3f84a0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.394290 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69bfcad7-d630-4361-b28d-f072ac3f84a0-kube-api-access-4s8g5" (OuterVolumeSpecName: "kube-api-access-4s8g5") pod "69bfcad7-d630-4361-b28d-f072ac3f84a0" (UID: "69bfcad7-d630-4361-b28d-f072ac3f84a0"). InnerVolumeSpecName "kube-api-access-4s8g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.396768 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-scripts" (OuterVolumeSpecName: "scripts") pod "69bfcad7-d630-4361-b28d-f072ac3f84a0" (UID: "69bfcad7-d630-4361-b28d-f072ac3f84a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.416659 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69bfcad7-d630-4361-b28d-f072ac3f84a0" (UID: "69bfcad7-d630-4361-b28d-f072ac3f84a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.459268 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-config-data" (OuterVolumeSpecName: "config-data") pod "69bfcad7-d630-4361-b28d-f072ac3f84a0" (UID: "69bfcad7-d630-4361-b28d-f072ac3f84a0"). InnerVolumeSpecName "config-data". 
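
The pod_startup_latency_tracker entry for openstack/ceilometer-0 above reports two numbers: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling), so registry latency does not count against the startup SLO. A minimal Go sketch reproducing the arithmetic from the logged timestamps (the variable names are mine, not the kubelet's):

```go
// Sketch: reproducing the pod_startup_latency_tracker numbers for
// openstack/ceilometer-0 from the timestamps logged above.
package main

import (
	"fmt"
	"time"
)

// Layout matching the log's "2025-10-02 11:31:17 +0000 UTC" format.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-02 11:31:17 +0000 UTC")             // podCreationTimestamp
	firstPull := mustParse("2025-10-02 11:31:18.829738297 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2025-10-02 11:31:22.856740505 +0000 UTC")  // lastFinishedPulling
	running := mustParse("2025-10-02 11:31:24.018364004 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: pull time excluded
	fmt.Println(e2e)                     // 7.018364004s
	fmt.Println(slo)                     // 2.991361796s
}
```

Both printed values match the log entry exactly, which confirms the relationship: the ~4.03 s spent pulling images is charged to E2E duration but not to the SLO duration.
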
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.479448 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.479696 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-log" containerID="cri-o://33d493d22c989aac977ca56b30eaa4751bee9cf5bc3d357763c0d2fbfd6345c3" gracePeriod=30 Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.479880 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-httpd" containerID="cri-o://bf1df801c33923c52b57d70ffbeb276141d6b88cfc7248fef720e8247b706bf1" gracePeriod=30 Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.492102 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s8g5\" (UniqueName: \"kubernetes.io/projected/69bfcad7-d630-4361-b28d-f072ac3f84a0-kube-api-access-4s8g5\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.492135 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.492145 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.492156 4929 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.492164 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69bfcad7-d630-4361-b28d-f072ac3f84a0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.492172 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69bfcad7-d630-4361-b28d-f072ac3f84a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.786831 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5r4p6"] Oct 02 11:31:24 crc kubenswrapper[4929]: E1002 11:31:24.798712 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bfcad7-d630-4361-b28d-f072ac3f84a0" containerName="cinder-db-sync" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.799009 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bfcad7-d630-4361-b28d-f072ac3f84a0" containerName="cinder-db-sync" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.799409 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bfcad7-d630-4361-b28d-f072ac3f84a0" containerName="cinder-db-sync" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.801329 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5r4p6" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.799951 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5r4p6"] Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.890239 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vcg7z"] Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.891664 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vcg7z" Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.900401 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vcg7z"] Oct 02 11:31:24 crc kubenswrapper[4929]: I1002 11:31:24.902914 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f942\" (UniqueName: \"kubernetes.io/projected/4783768c-7664-4ff2-98a8-69e9d744462c-kube-api-access-9f942\") pod \"nova-api-db-create-5r4p6\" (UID: \"4783768c-7664-4ff2-98a8-69e9d744462c\") " pod="openstack/nova-api-db-create-5r4p6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.002094 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zr8dn" event={"ID":"69bfcad7-d630-4361-b28d-f072ac3f84a0","Type":"ContainerDied","Data":"9b34ab613530d9e75905f5f57186438f8859c4a643142b0f3841e78f9d898fb3"} Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.002138 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b34ab613530d9e75905f5f57186438f8859c4a643142b0f3841e78f9d898fb3" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.002222 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zr8dn" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.004882 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-m8bs6"] Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.005416 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kx7t\" (UniqueName: \"kubernetes.io/projected/c392507b-f994-40e8-aa01-63c09691bfa0-kube-api-access-8kx7t\") pod \"nova-cell0-db-create-vcg7z\" (UID: \"c392507b-f994-40e8-aa01-63c09691bfa0\") " pod="openstack/nova-cell0-db-create-vcg7z" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.005592 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f942\" (UniqueName: \"kubernetes.io/projected/4783768c-7664-4ff2-98a8-69e9d744462c-kube-api-access-9f942\") pod \"nova-api-db-create-5r4p6\" (UID: \"4783768c-7664-4ff2-98a8-69e9d744462c\") " pod="openstack/nova-api-db-create-5r4p6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.005938 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-m8bs6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.011711 4929 generic.go:334] "Generic (PLEG): container finished" podID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerID="33d493d22c989aac977ca56b30eaa4751bee9cf5bc3d357763c0d2fbfd6345c3" exitCode=143 Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.011789 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31f4d876-4669-4a1f-b3c4-90f39131f726","Type":"ContainerDied","Data":"33d493d22c989aac977ca56b30eaa4751bee9cf5bc3d357763c0d2fbfd6345c3"} Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.025834 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m8bs6"] Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026425 4929 generic.go:334] "Generic (PLEG): container finished" podID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerID="3387b0f610a0771b5e4b0afc03a43a2a8b253379e8cf5aca9aeff9b78fc5774a" exitCode=0 Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026455 4929 generic.go:334] "Generic (PLEG): container finished" podID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerID="7cef31376c1a988cee5e9685c5addae332685db009f93594773be98bf8c35a42" exitCode=2 Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026465 4929 generic.go:334] "Generic (PLEG): container finished" podID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerID="279a9d22ea46e070e9328f1f77e447d82ff7f9fa5db938dca85bb0f5307bcefd" exitCode=0 Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026474 4929 generic.go:334] "Generic (PLEG): container finished" podID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerID="a685f49e8ccebbdb3014bb3b0acc97afba3ace252dbadee82a874614d63e85f0" exitCode=0 Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026504 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerDied","Data":"3387b0f610a0771b5e4b0afc03a43a2a8b253379e8cf5aca9aeff9b78fc5774a"} Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026549 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerDied","Data":"7cef31376c1a988cee5e9685c5addae332685db009f93594773be98bf8c35a42"} Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026568 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerDied","Data":"279a9d22ea46e070e9328f1f77e447d82ff7f9fa5db938dca85bb0f5307bcefd"} Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.026579 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerDied","Data":"a685f49e8ccebbdb3014bb3b0acc97afba3ace252dbadee82a874614d63e85f0"} Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.039624 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f942\" (UniqueName: \"kubernetes.io/projected/4783768c-7664-4ff2-98a8-69e9d744462c-kube-api-access-9f942\") pod \"nova-api-db-create-5r4p6\" (UID: \"4783768c-7664-4ff2-98a8-69e9d744462c\") " pod="openstack/nova-api-db-create-5r4p6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.045233 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.106265 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-combined-ca-bundle\") pod \"0696b9dd-264f-4238-bfb7-0268dcc333e5\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.106538 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwd96\" (UniqueName: \"kubernetes.io/projected/0696b9dd-264f-4238-bfb7-0268dcc333e5-kube-api-access-bwd96\") pod \"0696b9dd-264f-4238-bfb7-0268dcc333e5\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.106649 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-log-httpd\") pod \"0696b9dd-264f-4238-bfb7-0268dcc333e5\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.106786 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-sg-core-conf-yaml\") pod \"0696b9dd-264f-4238-bfb7-0268dcc333e5\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.106869 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-config-data\") pod \"0696b9dd-264f-4238-bfb7-0268dcc333e5\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.107086 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-run-httpd\") pod \"0696b9dd-264f-4238-bfb7-0268dcc333e5\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.107422 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-scripts\") pod \"0696b9dd-264f-4238-bfb7-0268dcc333e5\" (UID: \"0696b9dd-264f-4238-bfb7-0268dcc333e5\") " Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.107762 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0696b9dd-264f-4238-bfb7-0268dcc333e5" (UID: "0696b9dd-264f-4238-bfb7-0268dcc333e5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.107910 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwjp\" (UniqueName: \"kubernetes.io/projected/20069136-d22e-4c10-b7c1-a4a735c4aaf3-kube-api-access-qlwjp\") pod \"nova-cell1-db-create-m8bs6\" (UID: \"20069136-d22e-4c10-b7c1-a4a735c4aaf3\") " pod="openstack/nova-cell1-db-create-m8bs6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.108059 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kx7t\" (UniqueName: \"kubernetes.io/projected/c392507b-f994-40e8-aa01-63c09691bfa0-kube-api-access-8kx7t\") pod \"nova-cell0-db-create-vcg7z\" (UID: \"c392507b-f994-40e8-aa01-63c09691bfa0\") " pod="openstack/nova-cell0-db-create-vcg7z" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.108229 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.110011 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0696b9dd-264f-4238-bfb7-0268dcc333e5" (UID: "0696b9dd-264f-4238-bfb7-0268dcc333e5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.116783 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0696b9dd-264f-4238-bfb7-0268dcc333e5-kube-api-access-bwd96" (OuterVolumeSpecName: "kube-api-access-bwd96") pod "0696b9dd-264f-4238-bfb7-0268dcc333e5" (UID: "0696b9dd-264f-4238-bfb7-0268dcc333e5"). InnerVolumeSpecName "kube-api-access-bwd96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.120460 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5r4p6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.169890 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kx7t\" (UniqueName: \"kubernetes.io/projected/c392507b-f994-40e8-aa01-63c09691bfa0-kube-api-access-8kx7t\") pod \"nova-cell0-db-create-vcg7z\" (UID: \"c392507b-f994-40e8-aa01-63c09691bfa0\") " pod="openstack/nova-cell0-db-create-vcg7z" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.169900 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-scripts" (OuterVolumeSpecName: "scripts") pod "0696b9dd-264f-4238-bfb7-0268dcc333e5" (UID: "0696b9dd-264f-4238-bfb7-0268dcc333e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.177131 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0696b9dd-264f-4238-bfb7-0268dcc333e5" (UID: "0696b9dd-264f-4238-bfb7-0268dcc333e5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.212624 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwjp\" (UniqueName: \"kubernetes.io/projected/20069136-d22e-4c10-b7c1-a4a735c4aaf3-kube-api-access-qlwjp\") pod \"nova-cell1-db-create-m8bs6\" (UID: \"20069136-d22e-4c10-b7c1-a4a735c4aaf3\") " pod="openstack/nova-cell1-db-create-m8bs6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.212755 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwd96\" (UniqueName: \"kubernetes.io/projected/0696b9dd-264f-4238-bfb7-0268dcc333e5-kube-api-access-bwd96\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.212771 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.212782 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0696b9dd-264f-4238-bfb7-0268dcc333e5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.212793 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.244382 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vcg7z" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.245651 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:25 crc kubenswrapper[4929]: E1002 11:31:25.247229 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-central-agent" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247280 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-central-agent" Oct 02 11:31:25 crc kubenswrapper[4929]: E1002 11:31:25.247302 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="sg-core" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247311 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="sg-core" Oct 02 11:31:25 crc kubenswrapper[4929]: E1002 11:31:25.247369 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-notification-agent" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247376 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-notification-agent" Oct 02 11:31:25 crc kubenswrapper[4929]: E1002 11:31:25.247388 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="proxy-httpd" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247394 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="proxy-httpd" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247718 4929 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-notification-agent" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247731 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="sg-core" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247769 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="proxy-httpd" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.247782 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" containerName="ceilometer-central-agent" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.255641 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.264931 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.265270 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qpp2l" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.265517 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.267926 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.270123 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwjp\" (UniqueName: \"kubernetes.io/projected/20069136-d22e-4c10-b7c1-a4a735c4aaf3-kube-api-access-qlwjp\") pod \"nova-cell1-db-create-m8bs6\" (UID: \"20069136-d22e-4c10-b7c1-a4a735c4aaf3\") " pod="openstack/nova-cell1-db-create-m8bs6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.274319 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.314561 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-t56n4"] Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.315940 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b308cee-10cf-4b64-87a5-47d42061f7ef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.316032 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-scripts\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.316090 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skc5b\" (UniqueName: \"kubernetes.io/projected/9b308cee-10cf-4b64-87a5-47d42061f7ef-kube-api-access-skc5b\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.316113 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.316140 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.316182 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.321109 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.356595 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m8bs6" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.404301 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-t56n4"] Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.413145 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0696b9dd-264f-4238-bfb7-0268dcc333e5" (UID: "0696b9dd-264f-4238-bfb7-0268dcc333e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420212 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-svc\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420251 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkwdf\" (UniqueName: \"kubernetes.io/projected/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-kube-api-access-hkwdf\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420332 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420361 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b308cee-10cf-4b64-87a5-47d42061f7ef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420405 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-scripts\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420427 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420507 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b308cee-10cf-4b64-87a5-47d42061f7ef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420572 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skc5b\" (UniqueName: \"kubernetes.io/projected/9b308cee-10cf-4b64-87a5-47d42061f7ef-kube-api-access-skc5b\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420594 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-config\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" 
Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420621 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420654 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420679 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.420744 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.422914 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.432765 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.445898 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-scripts\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.457879 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.459831 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skc5b\" (UniqueName: \"kubernetes.io/projected/9b308cee-10cf-4b64-87a5-47d42061f7ef-kube-api-access-skc5b\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.470717 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data\") pod \"cinder-scheduler-0\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") 
" pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.476361 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.503072 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.505324 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.512585 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.515752 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.519219 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-config-data" (OuterVolumeSpecName: "config-data") pod "0696b9dd-264f-4238-bfb7-0268dcc333e5" (UID: "0696b9dd-264f-4238-bfb7-0268dcc333e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.524973 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-config\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525032 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525066 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e0bffcd-73e3-4b11-80ad-183b8028d97f-logs\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525131 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-svc\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525148 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkwdf\" (UniqueName: \"kubernetes.io/projected/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-kube-api-access-hkwdf\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525167 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e0bffcd-73e3-4b11-80ad-183b8028d97f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 
crc kubenswrapper[4929]: I1002 11:31:25.525183 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-scripts\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525205 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525235 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525276 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nhw\" (UniqueName: \"kubernetes.io/projected/3e0bffcd-73e3-4b11-80ad-183b8028d97f-kube-api-access-n9nhw\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525307 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525331 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.525383 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0696b9dd-264f-4238-bfb7-0268dcc333e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.526507 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-config\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.527517 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.538584 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-svc\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.539884 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.539926 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.564653 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkwdf\" (UniqueName: \"kubernetes.io/projected/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-kube-api-access-hkwdf\") pod \"dnsmasq-dns-6578955fd5-t56n4\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627019 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627081 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nhw\" (UniqueName: \"kubernetes.io/projected/3e0bffcd-73e3-4b11-80ad-183b8028d97f-kube-api-access-n9nhw\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627120 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627180 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e0bffcd-73e3-4b11-80ad-183b8028d97f-logs\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627244 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e0bffcd-73e3-4b11-80ad-183b8028d97f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627259 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-scripts\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627283 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627500 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e0bffcd-73e3-4b11-80ad-183b8028d97f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.627851 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e0bffcd-73e3-4b11-80ad-183b8028d97f-logs\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.631055 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.631178 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.631931 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.632176 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-scripts\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.643901 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nhw\" (UniqueName: \"kubernetes.io/projected/3e0bffcd-73e3-4b11-80ad-183b8028d97f-kube-api-access-n9nhw\") pod \"cinder-api-0\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.807151 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.870371 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:31:25 crc kubenswrapper[4929]: I1002 11:31:25.891746 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5r4p6"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.019493 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vcg7z"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.034980 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m8bs6"] Oct 02 11:31:26 crc kubenswrapper[4929]: W1002 11:31:26.038754 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20069136_d22e_4c10_b7c1_a4a735c4aaf3.slice/crio-a6a7c4e6de74044701e38366d35f1d23180275869d4f62f0cba72a62e6958c84 WatchSource:0}: Error finding container a6a7c4e6de74044701e38366d35f1d23180275869d4f62f0cba72a62e6958c84: Status 404 returned error can't find the container with id a6a7c4e6de74044701e38366d35f1d23180275869d4f62f0cba72a62e6958c84 Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.060795 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.060788 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0696b9dd-264f-4238-bfb7-0268dcc333e5","Type":"ContainerDied","Data":"f3f1355b3480de3134c5546fbba254b12bd6f8442f43d83d991fda9e18c01eac"} Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.061087 4929 scope.go:117] "RemoveContainer" containerID="3387b0f610a0771b5e4b0afc03a43a2a8b253379e8cf5aca9aeff9b78fc5774a" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.065313 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5r4p6" event={"ID":"4783768c-7664-4ff2-98a8-69e9d744462c","Type":"ContainerStarted","Data":"507c57d1054c8079d2c287a4a48364b823334726bda879b0b6bcdbe80bb915ff"} Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.131219 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.134787 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.146249 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.147333 4929 scope.go:117] "RemoveContainer" containerID="7cef31376c1a988cee5e9685c5addae332685db009f93594773be98bf8c35a42" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.148916 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.152185 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.153107 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.153323 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.228202 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0696b9dd-264f-4238-bfb7-0268dcc333e5" path="/var/lib/kubelet/pods/0696b9dd-264f-4238-bfb7-0268dcc333e5/volumes" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238482 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238541 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-run-httpd\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238593 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-scripts\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238621 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-log-httpd\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238646 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-config-data\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238715 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238744 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8cm\" (UniqueName: \"kubernetes.io/projected/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-kube-api-access-gd8cm\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.238826 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.246877 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.284644 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342086 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342398 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-run-httpd\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342422 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-scripts\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342438 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-log-httpd\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342462 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-config-data\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342494 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342520 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd8cm\" (UniqueName: \"kubernetes.io/projected/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-kube-api-access-gd8cm\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.342569 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.344755 4929 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-log-httpd\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.344781 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-run-httpd\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.349870 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.352127 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-config-data\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.355646 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-scripts\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.358806 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.362420 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.366826 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd8cm\" (UniqueName: \"kubernetes.io/projected/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-kube-api-access-gd8cm\") pod \"ceilometer-0\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.375014 4929 scope.go:117] "RemoveContainer" containerID="279a9d22ea46e070e9328f1f77e447d82ff7f9fa5db938dca85bb0f5307bcefd" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.409789 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-t56n4"] Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.516834 4929 scope.go:117] "RemoveContainer" containerID="a685f49e8ccebbdb3014bb3b0acc97afba3ace252dbadee82a874614d63e85f0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.522706 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:26 crc kubenswrapper[4929]: I1002 11:31:26.543329 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.045711 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.046157 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-log" containerID="cri-o://10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1" gracePeriod=30 Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.046563 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-httpd" containerID="cri-o://25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7" gracePeriod=30 Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.094609 4929 generic.go:334] "Generic (PLEG): container finished" podID="20069136-d22e-4c10-b7c1-a4a735c4aaf3" containerID="cb914ad47f69429fb77bb7b326a163fa90f79b907935df45b1b8e560bf4455c2" exitCode=0 Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.094865 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m8bs6" event={"ID":"20069136-d22e-4c10-b7c1-a4a735c4aaf3","Type":"ContainerDied","Data":"cb914ad47f69429fb77bb7b326a163fa90f79b907935df45b1b8e560bf4455c2"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.095039 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m8bs6" event={"ID":"20069136-d22e-4c10-b7c1-a4a735c4aaf3","Type":"ContainerStarted","Data":"a6a7c4e6de74044701e38366d35f1d23180275869d4f62f0cba72a62e6958c84"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.113281 4929 generic.go:334] "Generic (PLEG): container finished" podID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerID="0efbfb7e343f52067e17462a768ba0400daf0b304b0003a143ae764810c04a89" exitCode=0 Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.113580 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" event={"ID":"7ca7e74a-8ca4-4657-82ed-22cecb4a9267","Type":"ContainerDied","Data":"0efbfb7e343f52067e17462a768ba0400daf0b304b0003a143ae764810c04a89"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.113609 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" event={"ID":"7ca7e74a-8ca4-4657-82ed-22cecb4a9267","Type":"ContainerStarted","Data":"968e06ee8c084fdf4692fb0b3a4181ac1027be7fe1a467987aea8831ef52a321"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.120102 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.146911 4929 generic.go:334] "Generic (PLEG): container finished" podID="c392507b-f994-40e8-aa01-63c09691bfa0" containerID="33eb9bc027be9b469ed5574f73a2eb7bb9e751142cdb9f8eb6c7c3fbc32f3e27" exitCode=0 Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.147008 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vcg7z" 
event={"ID":"c392507b-f994-40e8-aa01-63c09691bfa0","Type":"ContainerDied","Data":"33eb9bc027be9b469ed5574f73a2eb7bb9e751142cdb9f8eb6c7c3fbc32f3e27"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.147042 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vcg7z" event={"ID":"c392507b-f994-40e8-aa01-63c09691bfa0","Type":"ContainerStarted","Data":"e8f48c50f47bb01b96c86c805fe89a79f712477f70047c9f2cdc7ed6ae07f3fc"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.155929 4929 generic.go:334] "Generic (PLEG): container finished" podID="4783768c-7664-4ff2-98a8-69e9d744462c" containerID="bb6ab76b4edcd3ece1c809ec2d716793e3d830f8bf38281a676ff06906c37bf9" exitCode=0 Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.156025 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5r4p6" event={"ID":"4783768c-7664-4ff2-98a8-69e9d744462c","Type":"ContainerDied","Data":"bb6ab76b4edcd3ece1c809ec2d716793e3d830f8bf38281a676ff06906c37bf9"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.174935 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b308cee-10cf-4b64-87a5-47d42061f7ef","Type":"ContainerStarted","Data":"d49cb9270a1dcba01008ee178230cf7cbd43c387648a9db325ba9eaeea83ccf4"} Oct 02 11:31:27 crc kubenswrapper[4929]: I1002 11:31:27.179447 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e0bffcd-73e3-4b11-80ad-183b8028d97f","Type":"ContainerStarted","Data":"896180156dd66d0a9a94027a59a3a82dfb9ebfc1b2a931bdfeb6109a59afae00"} Oct 02 11:31:27 crc kubenswrapper[4929]: W1002 11:31:27.254073 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9750f97a_b1ec_4ff8_9964_b6fba3c7050d.slice/crio-3265c54a4f57ac64adb9e2e5bd5d2acb111f7a5a5fdcb35e47f1b27d508b19fc WatchSource:0}: Error finding container 3265c54a4f57ac64adb9e2e5bd5d2acb111f7a5a5fdcb35e47f1b27d508b19fc: Status 404 returned error can't find the container with id 3265c54a4f57ac64adb9e2e5bd5d2acb111f7a5a5fdcb35e47f1b27d508b19fc Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.199049 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e0bffcd-73e3-4b11-80ad-183b8028d97f","Type":"ContainerStarted","Data":"b4c23593ffd88427329daed4cd6385647bd0a34bbdda04cb685bb9c7e76335c3"} Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.203103 4929 generic.go:334] "Generic (PLEG): container finished" podID="870cd509-a882-46d4-ae9d-c89ee843385a" containerID="10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1" exitCode=143 Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.203181 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"870cd509-a882-46d4-ae9d-c89ee843385a","Type":"ContainerDied","Data":"10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1"} Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.206578 4929 generic.go:334] "Generic (PLEG): container finished" podID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerID="bf1df801c33923c52b57d70ffbeb276141d6b88cfc7248fef720e8247b706bf1" exitCode=0 Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.206656 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"31f4d876-4669-4a1f-b3c4-90f39131f726","Type":"ContainerDied","Data":"bf1df801c33923c52b57d70ffbeb276141d6b88cfc7248fef720e8247b706bf1"} Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.206705 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31f4d876-4669-4a1f-b3c4-90f39131f726","Type":"ContainerDied","Data":"f754ae7612b56e4db8f64a6d9adccc56f38f9b0c392233864c54196836efa928"} Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.206720 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f754ae7612b56e4db8f64a6d9adccc56f38f9b0c392233864c54196836efa928" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.210674 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.210828 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" event={"ID":"7ca7e74a-8ca4-4657-82ed-22cecb4a9267","Type":"ContainerStarted","Data":"db29f1886411040e4a08ceb2843d594405344a82aa0e2fab095af2cdffa5fa9c"} Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.211210 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.212596 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerStarted","Data":"3265c54a4f57ac64adb9e2e5bd5d2acb111f7a5a5fdcb35e47f1b27d508b19fc"} Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.240596 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" podStartSLOduration=3.240570811 podStartE2EDuration="3.240570811s" podCreationTimestamp="2025-10-02 11:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:28.232494269 +0000 UTC m=+1288.782860633" watchObservedRunningTime="2025-10-02 11:31:28.240570811 +0000 UTC m=+1288.790937175" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.286562 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-httpd-run\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.286912 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.286950 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-public-tls-certs\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.287019 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-config-data\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: 
\"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.287080 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-scripts\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.287112 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-combined-ca-bundle\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.287135 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p468p\" (UniqueName: \"kubernetes.io/projected/31f4d876-4669-4a1f-b3c4-90f39131f726-kube-api-access-p468p\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.287155 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-logs\") pod \"31f4d876-4669-4a1f-b3c4-90f39131f726\" (UID: \"31f4d876-4669-4a1f-b3c4-90f39131f726\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.287276 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.287732 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.292805 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-logs" (OuterVolumeSpecName: "logs") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.304781 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f4d876-4669-4a1f-b3c4-90f39131f726-kube-api-access-p468p" (OuterVolumeSpecName: "kube-api-access-p468p") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "kube-api-access-p468p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.304953 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-scripts" (OuterVolumeSpecName: "scripts") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.306802 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.389346 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.389374 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.389387 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p468p\" (UniqueName: \"kubernetes.io/projected/31f4d876-4669-4a1f-b3c4-90f39131f726-kube-api-access-p468p\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.389398 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31f4d876-4669-4a1f-b3c4-90f39131f726-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.398925 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.403370 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-config-data" (OuterVolumeSpecName: "config-data") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.435183 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.448482 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "31f4d876-4669-4a1f-b3c4-90f39131f726" (UID: "31f4d876-4669-4a1f-b3c4-90f39131f726"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.501392 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.501419 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.501431 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.501510 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f4d876-4669-4a1f-b3c4-90f39131f726-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.507542 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.749154 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.803374 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m8bs6" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.839379 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5r4p6" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.917193 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f942\" (UniqueName: \"kubernetes.io/projected/4783768c-7664-4ff2-98a8-69e9d744462c-kube-api-access-9f942\") pod \"4783768c-7664-4ff2-98a8-69e9d744462c\" (UID: \"4783768c-7664-4ff2-98a8-69e9d744462c\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.917300 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwjp\" (UniqueName: \"kubernetes.io/projected/20069136-d22e-4c10-b7c1-a4a735c4aaf3-kube-api-access-qlwjp\") pod \"20069136-d22e-4c10-b7c1-a4a735c4aaf3\" (UID: \"20069136-d22e-4c10-b7c1-a4a735c4aaf3\") " Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.925327 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20069136-d22e-4c10-b7c1-a4a735c4aaf3-kube-api-access-qlwjp" (OuterVolumeSpecName: "kube-api-access-qlwjp") pod "20069136-d22e-4c10-b7c1-a4a735c4aaf3" (UID: "20069136-d22e-4c10-b7c1-a4a735c4aaf3"). InnerVolumeSpecName "kube-api-access-qlwjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.929800 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4783768c-7664-4ff2-98a8-69e9d744462c-kube-api-access-9f942" (OuterVolumeSpecName: "kube-api-access-9f942") pod "4783768c-7664-4ff2-98a8-69e9d744462c" (UID: "4783768c-7664-4ff2-98a8-69e9d744462c"). InnerVolumeSpecName "kube-api-access-9f942". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:28 crc kubenswrapper[4929]: I1002 11:31:28.977856 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vcg7z" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.019288 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kx7t\" (UniqueName: \"kubernetes.io/projected/c392507b-f994-40e8-aa01-63c09691bfa0-kube-api-access-8kx7t\") pod \"c392507b-f994-40e8-aa01-63c09691bfa0\" (UID: \"c392507b-f994-40e8-aa01-63c09691bfa0\") " Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.019854 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f942\" (UniqueName: \"kubernetes.io/projected/4783768c-7664-4ff2-98a8-69e9d744462c-kube-api-access-9f942\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.019871 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwjp\" (UniqueName: \"kubernetes.io/projected/20069136-d22e-4c10-b7c1-a4a735c4aaf3-kube-api-access-qlwjp\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.024266 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c392507b-f994-40e8-aa01-63c09691bfa0-kube-api-access-8kx7t" (OuterVolumeSpecName: "kube-api-access-8kx7t") pod "c392507b-f994-40e8-aa01-63c09691bfa0" (UID: "c392507b-f994-40e8-aa01-63c09691bfa0"). InnerVolumeSpecName "kube-api-access-8kx7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.123041 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kx7t\" (UniqueName: \"kubernetes.io/projected/c392507b-f994-40e8-aa01-63c09691bfa0-kube-api-access-8kx7t\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.244994 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5r4p6" event={"ID":"4783768c-7664-4ff2-98a8-69e9d744462c","Type":"ContainerDied","Data":"507c57d1054c8079d2c287a4a48364b823334726bda879b0b6bcdbe80bb915ff"} Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.245257 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507c57d1054c8079d2c287a4a48364b823334726bda879b0b6bcdbe80bb915ff" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.245314 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5r4p6" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.254035 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e0bffcd-73e3-4b11-80ad-183b8028d97f","Type":"ContainerStarted","Data":"30e5a4ae31d94b664d18b31af6f668ff807fcdfc9dab241f49d3553f3335eafe"} Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.255803 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m8bs6" event={"ID":"20069136-d22e-4c10-b7c1-a4a735c4aaf3","Type":"ContainerDied","Data":"a6a7c4e6de74044701e38366d35f1d23180275869d4f62f0cba72a62e6958c84"} Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.255921 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6a7c4e6de74044701e38366d35f1d23180275869d4f62f0cba72a62e6958c84" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.256042 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-m8bs6" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.264123 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vcg7z" event={"ID":"c392507b-f994-40e8-aa01-63c09691bfa0","Type":"ContainerDied","Data":"e8f48c50f47bb01b96c86c805fe89a79f712477f70047c9f2cdc7ed6ae07f3fc"} Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.264162 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8f48c50f47bb01b96c86c805fe89a79f712477f70047c9f2cdc7ed6ae07f3fc" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.264225 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vcg7z" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.274308 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerStarted","Data":"4e38c20007754dd62f2e497dafa007c2636559fe685e183be5b7bb54601f7a80"} Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.275047 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.332354 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.343580 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.357078 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.370321 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:31:29 crc kubenswrapper[4929]: E1002 11:31:29.371019 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-httpd" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.371119 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-httpd" Oct 02 11:31:29 crc kubenswrapper[4929]: E1002 11:31:29.371324 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-log" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.371407 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-log" Oct 02 11:31:29 crc kubenswrapper[4929]: E1002 11:31:29.371480 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c392507b-f994-40e8-aa01-63c09691bfa0" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.371556 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c392507b-f994-40e8-aa01-63c09691bfa0" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: E1002 11:31:29.371632 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20069136-d22e-4c10-b7c1-a4a735c4aaf3" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.371700 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="20069136-d22e-4c10-b7c1-a4a735c4aaf3" containerName="mariadb-database-create" Oct 02 
11:31:29 crc kubenswrapper[4929]: E1002 11:31:29.371783 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4783768c-7664-4ff2-98a8-69e9d744462c" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.371855 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4783768c-7664-4ff2-98a8-69e9d744462c" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.372172 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="20069136-d22e-4c10-b7c1-a4a735c4aaf3" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.372273 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4783768c-7664-4ff2-98a8-69e9d744462c" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.372366 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c392507b-f994-40e8-aa01-63c09691bfa0" containerName="mariadb-database-create" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.372445 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-log" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.372534 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" containerName="glance-httpd" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.373816 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.380478 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.380818 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.396907 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.531910 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-config-data\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.533247 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.535117 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.535309 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.535447 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-scripts\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.535861 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-logs\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.536021 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.536345 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fslgk\" (UniqueName: \"kubernetes.io/projected/a15b3dd7-69b2-480e-b61d-bba396447b88-kube-api-access-fslgk\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640087 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640228 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-scripts\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640353 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-logs\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640378 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640473 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fslgk\" (UniqueName: 
\"kubernetes.io/projected/a15b3dd7-69b2-480e-b61d-bba396447b88-kube-api-access-fslgk\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640531 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-config-data\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640624 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.640865 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.641426 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.644943 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-logs\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.645337 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.648000 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.649709 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-scripts\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.655928 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.661598 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.674059 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fslgk\" (UniqueName: \"kubernetes.io/projected/a15b3dd7-69b2-480e-b61d-bba396447b88-kube-api-access-fslgk\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.690618 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " pod="openstack/glance-default-external-api-0" Oct 02 11:31:29 crc kubenswrapper[4929]: I1002 11:31:29.697750 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.174746 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f4d876-4669-4a1f-b3c4-90f39131f726" path="/var/lib/kubelet/pods/31f4d876-4669-4a1f-b3c4-90f39131f726/volumes" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.311419 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.345667 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api-log" containerID="cri-o://b4c23593ffd88427329daed4cd6385647bd0a34bbdda04cb685bb9c7e76335c3" gracePeriod=30 Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.345752 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b308cee-10cf-4b64-87a5-47d42061f7ef","Type":"ContainerStarted","Data":"653ef35682b76207b877dd55de5273bc6aa607c556d163f5c8887c299458307a"} Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.345789 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.346081 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api" containerID="cri-o://30e5a4ae31d94b664d18b31af6f668ff807fcdfc9dab241f49d3553f3335eafe" gracePeriod=30 Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.760352 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.795882 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.795862188 podStartE2EDuration="5.795862188s" podCreationTimestamp="2025-10-02 11:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:30.369583381 +0000 UTC m=+1290.919949755" watchObservedRunningTime="2025-10-02 11:31:30.795862188 +0000 UTC m=+1291.346228552" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881534 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-scripts\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881606 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgwfj\" (UniqueName: \"kubernetes.io/projected/870cd509-a882-46d4-ae9d-c89ee843385a-kube-api-access-tgwfj\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881659 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-combined-ca-bundle\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881708 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-logs\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881734 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881790 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-config-data\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881813 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-internal-tls-certs\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.881896 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-httpd-run\") pod \"870cd509-a882-46d4-ae9d-c89ee843385a\" (UID: \"870cd509-a882-46d4-ae9d-c89ee843385a\") " Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.882794 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.883697 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-logs" (OuterVolumeSpecName: "logs") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.891140 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870cd509-a882-46d4-ae9d-c89ee843385a-kube-api-access-tgwfj" (OuterVolumeSpecName: "kube-api-access-tgwfj") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "kube-api-access-tgwfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.895083 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-scripts" (OuterVolumeSpecName: "scripts") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.915439 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.990137 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.990444 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.990459 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgwfj\" (UniqueName: \"kubernetes.io/projected/870cd509-a882-46d4-ae9d-c89ee843385a-kube-api-access-tgwfj\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.990474 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870cd509-a882-46d4-ae9d-c89ee843385a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:30 crc kubenswrapper[4929]: I1002 11:31:30.990498 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.029252 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-config-data" (OuterVolumeSpecName: "config-data") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.036923 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.066190 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.090315 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "870cd509-a882-46d4-ae9d-c89ee843385a" (UID: "870cd509-a882-46d4-ae9d-c89ee843385a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.099932 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.100250 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.100328 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870cd509-a882-46d4-ae9d-c89ee843385a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.100384 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.362928 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a15b3dd7-69b2-480e-b61d-bba396447b88","Type":"ContainerStarted","Data":"1389318952a4ecdc871220633fb534613019d433f527dfac6df8c882a2e3adce"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.371773 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerStarted","Data":"6ffa84d6075b225d27a9c6883ad622967a84dcdf1ced587164a41107b03382a7"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.371843 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerStarted","Data":"1b0bf03eaf150366334d5891e2fd4d6512e69038bf20d60bcf9d1e7fda8855e2"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.387343 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b308cee-10cf-4b64-87a5-47d42061f7ef","Type":"ContainerStarted","Data":"0e8f8533ada199772c7121378d7e1e900c80af1669d625c820c85321a406567f"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.390497 4929 generic.go:334] "Generic (PLEG): container finished" podID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerID="30e5a4ae31d94b664d18b31af6f668ff807fcdfc9dab241f49d3553f3335eafe" exitCode=0 Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.390523 4929 generic.go:334] "Generic (PLEG): container finished" podID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerID="b4c23593ffd88427329daed4cd6385647bd0a34bbdda04cb685bb9c7e76335c3" exitCode=143 Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.390557 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e0bffcd-73e3-4b11-80ad-183b8028d97f","Type":"ContainerDied","Data":"30e5a4ae31d94b664d18b31af6f668ff807fcdfc9dab241f49d3553f3335eafe"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.390576 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3e0bffcd-73e3-4b11-80ad-183b8028d97f","Type":"ContainerDied","Data":"b4c23593ffd88427329daed4cd6385647bd0a34bbdda04cb685bb9c7e76335c3"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.390586 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"3e0bffcd-73e3-4b11-80ad-183b8028d97f","Type":"ContainerDied","Data":"896180156dd66d0a9a94027a59a3a82dfb9ebfc1b2a931bdfeb6109a59afae00"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.390633 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="896180156dd66d0a9a94027a59a3a82dfb9ebfc1b2a931bdfeb6109a59afae00" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.411254 4929 generic.go:334] "Generic (PLEG): container finished" podID="870cd509-a882-46d4-ae9d-c89ee843385a" containerID="25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7" exitCode=0 Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.411292 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"870cd509-a882-46d4-ae9d-c89ee843385a","Type":"ContainerDied","Data":"25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.411320 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"870cd509-a882-46d4-ae9d-c89ee843385a","Type":"ContainerDied","Data":"96cc5772dd3a996bc587e15ddbf1127a1e2b232cc033154460387f1b4dcc5c2f"} Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.411335 4929 scope.go:117] "RemoveContainer" containerID="25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.411456 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.429236 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.432794761 podStartE2EDuration="6.429220684s" podCreationTimestamp="2025-10-02 11:31:25 +0000 UTC" firstStartedPulling="2025-10-02 11:31:26.218047007 +0000 UTC m=+1286.768413371" lastFinishedPulling="2025-10-02 11:31:28.21447293 +0000 UTC m=+1288.764839294" observedRunningTime="2025-10-02 11:31:31.409994341 +0000 UTC m=+1291.960360735" watchObservedRunningTime="2025-10-02 11:31:31.429220684 +0000 UTC m=+1291.979587048" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.464268 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.469123 4929 scope.go:117] "RemoveContainer" containerID="10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.474605 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.521744 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-combined-ca-bundle\") pod \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.521872 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-scripts\") pod \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.522078 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data\") pod \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.522101 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9nhw\" (UniqueName: \"kubernetes.io/projected/3e0bffcd-73e3-4b11-80ad-183b8028d97f-kube-api-access-n9nhw\") pod \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.522152 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e0bffcd-73e3-4b11-80ad-183b8028d97f-etc-machine-id\") pod \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.522417 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data-custom\") pod \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.522473 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e0bffcd-73e3-4b11-80ad-183b8028d97f-logs\") pod \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\" (UID: \"3e0bffcd-73e3-4b11-80ad-183b8028d97f\") " Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.525567 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e0bffcd-73e3-4b11-80ad-183b8028d97f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3e0bffcd-73e3-4b11-80ad-183b8028d97f" (UID: "3e0bffcd-73e3-4b11-80ad-183b8028d97f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.528886 4929 scope.go:117] "RemoveContainer" containerID="25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.529827 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e0bffcd-73e3-4b11-80ad-183b8028d97f-logs" (OuterVolumeSpecName: "logs") pod "3e0bffcd-73e3-4b11-80ad-183b8028d97f" (UID: "3e0bffcd-73e3-4b11-80ad-183b8028d97f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: E1002 11:31:31.529941 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7\": container with ID starting with 25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7 not found: ID does not exist" containerID="25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.529984 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7"} err="failed to get container status \"25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7\": rpc error: code = NotFound desc = could not find container \"25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7\": container with ID starting with 25a28d3fd6c839dac6112d0b4c5710d854fed72cffd6b16091f81c0e168b60f7 not found: ID does not exist" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.530009 4929 scope.go:117] "RemoveContainer" containerID="10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.530726 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e0bffcd-73e3-4b11-80ad-183b8028d97f-kube-api-access-n9nhw" (OuterVolumeSpecName: "kube-api-access-n9nhw") pod "3e0bffcd-73e3-4b11-80ad-183b8028d97f" (UID: "3e0bffcd-73e3-4b11-80ad-183b8028d97f"). InnerVolumeSpecName "kube-api-access-n9nhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.533627 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-scripts" (OuterVolumeSpecName: "scripts") pod "3e0bffcd-73e3-4b11-80ad-183b8028d97f" (UID: "3e0bffcd-73e3-4b11-80ad-183b8028d97f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: E1002 11:31:31.535348 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1\": container with ID starting with 10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1 not found: ID does not exist" containerID="10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.535402 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1"} err="failed to get container status \"10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1\": rpc error: code = NotFound desc = could not find container \"10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1\": container with ID starting with 10b635a627dbd557cfa9bf5022ea88b5442253714eeb1f1f337553fb2b4e48c1 not found: ID does not exist" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.537617 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.553171 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e0bffcd-73e3-4b11-80ad-183b8028d97f" (UID: "3e0bffcd-73e3-4b11-80ad-183b8028d97f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557185 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:31:31 crc kubenswrapper[4929]: E1002 11:31:31.557526 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-httpd" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557541 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-httpd" Oct 02 11:31:31 crc kubenswrapper[4929]: E1002 11:31:31.557557 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-log" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557564 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-log" Oct 02 11:31:31 crc kubenswrapper[4929]: E1002 11:31:31.557573 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557579 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api" Oct 02 11:31:31 crc kubenswrapper[4929]: E1002 11:31:31.557601 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api-log" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557607 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api-log" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557771 4929 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557789 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-httpd" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557811 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" containerName="cinder-api-log" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.557820 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" containerName="glance-log" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.558432 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e0bffcd-73e3-4b11-80ad-183b8028d97f" (UID: "3e0bffcd-73e3-4b11-80ad-183b8028d97f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.558967 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.560748 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.561270 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.582857 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.616076 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data" (OuterVolumeSpecName: "config-data") pod "3e0bffcd-73e3-4b11-80ad-183b8028d97f" (UID: "3e0bffcd-73e3-4b11-80ad-183b8028d97f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.624475 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e0bffcd-73e3-4b11-80ad-183b8028d97f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.624502 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.624511 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.624519 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.624532 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9nhw\" (UniqueName: \"kubernetes.io/projected/3e0bffcd-73e3-4b11-80ad-183b8028d97f-kube-api-access-n9nhw\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.624543 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e0bffcd-73e3-4b11-80ad-183b8028d97f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.624553 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e0bffcd-73e3-4b11-80ad-183b8028d97f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.726062 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.726613 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.726679 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.726698 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc 
kubenswrapper[4929]: I1002 11:31:31.726761 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.726794 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.726846 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.726914 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g99h\" (UniqueName: \"kubernetes.io/projected/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-kube-api-access-8g99h\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.829312 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.829560 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g99h\" (UniqueName: \"kubernetes.io/projected/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-kube-api-access-8g99h\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.829632 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.831336 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.831547 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 
11:31:31.831589 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.831752 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.831825 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.832575 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.832868 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.829807 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.839125 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.841732 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.843450 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.845337 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.851708 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g99h\" (UniqueName: \"kubernetes.io/projected/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-kube-api-access-8g99h\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.872790 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:31:31 crc kubenswrapper[4929]: I1002 11:31:31.901397 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.177022 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870cd509-a882-46d4-ae9d-c89ee843385a" path="/var/lib/kubelet/pods/870cd509-a882-46d4-ae9d-c89ee843385a/volumes" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.424613 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a15b3dd7-69b2-480e-b61d-bba396447b88","Type":"ContainerStarted","Data":"41c95ee7304d226a10f65f9f4eb7f25c911fc6a7c2f8bd69eab49e70a8a3e99a"} Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.424980 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a15b3dd7-69b2-480e-b61d-bba396447b88","Type":"ContainerStarted","Data":"baa739f57ecb397b52caaf29fc37695b9a4448c825f0be5520586e3d2f8dccf3"} Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.424637 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.448911 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.448896119 podStartE2EDuration="3.448896119s" podCreationTimestamp="2025-10-02 11:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:32.443604487 +0000 UTC m=+1292.993970851" watchObservedRunningTime="2025-10-02 11:31:32.448896119 +0000 UTC m=+1292.999262473" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.472214 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.487073 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.526121 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.592873 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.594423 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.603408 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.603720 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.604005 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.604788 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752322 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752402 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39949247-a1b3-41bc-a94a-4c59049576cd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752423 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752443 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39949247-a1b3-41bc-a94a-4c59049576cd-logs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752528 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-scripts\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752558 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgm7\" (UniqueName: \"kubernetes.io/projected/39949247-a1b3-41bc-a94a-4c59049576cd-kube-api-access-8kgm7\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752578 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.752596 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data-custom\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854442 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854738 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39949247-a1b3-41bc-a94a-4c59049576cd-logs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854799 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854834 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-scripts\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854879 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgm7\" (UniqueName: \"kubernetes.io/projected/39949247-a1b3-41bc-a94a-4c59049576cd-kube-api-access-8kgm7\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854898 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854918 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data-custom\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.854979 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.855008 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39949247-a1b3-41bc-a94a-4c59049576cd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.855176 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39949247-a1b3-41bc-a94a-4c59049576cd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.855244 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39949247-a1b3-41bc-a94a-4c59049576cd-logs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.859836 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.861176 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-scripts\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.861233 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.863825 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data-custom\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.864634 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.865397 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:32 crc kubenswrapper[4929]: I1002 11:31:32.879483 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgm7\" (UniqueName: \"kubernetes.io/projected/39949247-a1b3-41bc-a94a-4c59049576cd-kube-api-access-8kgm7\") pod \"cinder-api-0\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " pod="openstack/cinder-api-0" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.118851 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.256138 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.366716 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-ovndb-tls-certs\") pod \"c72eb1cc-3002-4941-878e-409ee9abeed1\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.366932 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-httpd-config\") pod \"c72eb1cc-3002-4941-878e-409ee9abeed1\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.367014 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgv6t\" (UniqueName: \"kubernetes.io/projected/c72eb1cc-3002-4941-878e-409ee9abeed1-kube-api-access-fgv6t\") pod \"c72eb1cc-3002-4941-878e-409ee9abeed1\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.367047 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-combined-ca-bundle\") pod \"c72eb1cc-3002-4941-878e-409ee9abeed1\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.367126 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-config\") pod \"c72eb1cc-3002-4941-878e-409ee9abeed1\" (UID: \"c72eb1cc-3002-4941-878e-409ee9abeed1\") " Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.380219 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72eb1cc-3002-4941-878e-409ee9abeed1-kube-api-access-fgv6t" (OuterVolumeSpecName: "kube-api-access-fgv6t") pod "c72eb1cc-3002-4941-878e-409ee9abeed1" (UID: "c72eb1cc-3002-4941-878e-409ee9abeed1"). InnerVolumeSpecName "kube-api-access-fgv6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.383120 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c72eb1cc-3002-4941-878e-409ee9abeed1" (UID: "c72eb1cc-3002-4941-878e-409ee9abeed1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.433475 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-config" (OuterVolumeSpecName: "config") pod "c72eb1cc-3002-4941-878e-409ee9abeed1" (UID: "c72eb1cc-3002-4941-878e-409ee9abeed1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.450449 4929 generic.go:334] "Generic (PLEG): container finished" podID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerID="8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd" exitCode=0 Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.450606 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d65bddd44-jz54h" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.450661 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d65bddd44-jz54h" event={"ID":"c72eb1cc-3002-4941-878e-409ee9abeed1","Type":"ContainerDied","Data":"8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd"} Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.450711 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d65bddd44-jz54h" event={"ID":"c72eb1cc-3002-4941-878e-409ee9abeed1","Type":"ContainerDied","Data":"dfdcdd042f69f51d7e5b6a8fc8175f623f8593f9dbb166f1591da0878719d1a2"} Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.450732 4929 scope.go:117] "RemoveContainer" containerID="aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.452273 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8ebda2a-aee6-4eed-8333-5e96219fdcb3","Type":"ContainerStarted","Data":"451546a934f38d915cbb04879f0147f97d390d02d97e32ed999e258f1445f92c"} Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.452309 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8ebda2a-aee6-4eed-8333-5e96219fdcb3","Type":"ContainerStarted","Data":"9a173bc4afd6cc5b5d7e163fdc9b0e65bd13273638d86d4dad5442dfad5d3304"} Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.452840 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c72eb1cc-3002-4941-878e-409ee9abeed1" (UID: "c72eb1cc-3002-4941-878e-409ee9abeed1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.457386 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c72eb1cc-3002-4941-878e-409ee9abeed1" (UID: "c72eb1cc-3002-4941-878e-409ee9abeed1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.458493 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerStarted","Data":"80bfd45d586b9a6320e5eb7d06a1d12de4e11f7dc3cc374d6f14e024fb521aea"} Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.458600 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="ceilometer-central-agent" containerID="cri-o://4e38c20007754dd62f2e497dafa007c2636559fe685e183be5b7bb54601f7a80" gracePeriod=30 Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.459189 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="proxy-httpd" containerID="cri-o://80bfd45d586b9a6320e5eb7d06a1d12de4e11f7dc3cc374d6f14e024fb521aea" gracePeriod=30 Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.459257 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="sg-core" containerID="cri-o://6ffa84d6075b225d27a9c6883ad622967a84dcdf1ced587164a41107b03382a7" gracePeriod=30 Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.459301 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="ceilometer-notification-agent" containerID="cri-o://1b0bf03eaf150366334d5891e2fd4d6512e69038bf20d60bcf9d1e7fda8855e2" gracePeriod=30 Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.469625 4929 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.469658 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.469667 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgv6t\" (UniqueName: \"kubernetes.io/projected/c72eb1cc-3002-4941-878e-409ee9abeed1-kube-api-access-fgv6t\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.469676 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.469684 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c72eb1cc-3002-4941-878e-409ee9abeed1-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.476245 4929 scope.go:117] "RemoveContainer" containerID="8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.488819 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.061715295 podStartE2EDuration="7.488798796s" podCreationTimestamp="2025-10-02 11:31:26 +0000 UTC" 
firstStartedPulling="2025-10-02 11:31:27.257164491 +0000 UTC m=+1287.807530855" lastFinishedPulling="2025-10-02 11:31:32.684247992 +0000 UTC m=+1293.234614356" observedRunningTime="2025-10-02 11:31:33.476600095 +0000 UTC m=+1294.026966459" watchObservedRunningTime="2025-10-02 11:31:33.488798796 +0000 UTC m=+1294.039165160" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.503566 4929 scope.go:117] "RemoveContainer" containerID="aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d" Oct 02 11:31:33 crc kubenswrapper[4929]: E1002 11:31:33.504047 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d\": container with ID starting with aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d not found: ID does not exist" containerID="aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.504095 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d"} err="failed to get container status \"aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d\": rpc error: code = NotFound desc = could not find container \"aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d\": container with ID starting with aca9819a328bd205e909b9e1c70cb6f60dd76ea580e0c91773929e8286ff6a1d not found: ID does not exist" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.504134 4929 scope.go:117] "RemoveContainer" containerID="8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd" Oct 02 11:31:33 crc kubenswrapper[4929]: E1002 11:31:33.504939 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd\": container with ID starting with 8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd not found: ID does not exist" containerID="8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.504981 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd"} err="failed to get container status \"8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd\": rpc error: code = NotFound desc = could not find container \"8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd\": container with ID starting with 8a65f90eb34bd030503a40abdb6b63620522e86ba27d823fc7e3205f9033cdcd not found: ID does not exist" Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.628746 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:31:33 crc kubenswrapper[4929]: W1002 11:31:33.656552 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39949247_a1b3_41bc_a94a_4c59049576cd.slice/crio-67256d7f709b19a7c49467b7589a840b7969dd0bd524771819f830758a4e416d WatchSource:0}: Error finding container 67256d7f709b19a7c49467b7589a840b7969dd0bd524771819f830758a4e416d: Status 404 returned error can't find the container with id 67256d7f709b19a7c49467b7589a840b7969dd0bd524771819f830758a4e416d Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.790386 
4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6d65bddd44-jz54h"] Oct 02 11:31:33 crc kubenswrapper[4929]: I1002 11:31:33.796332 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6d65bddd44-jz54h"] Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.185975 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e0bffcd-73e3-4b11-80ad-183b8028d97f" path="/var/lib/kubelet/pods/3e0bffcd-73e3-4b11-80ad-183b8028d97f/volumes" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.187361 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" path="/var/lib/kubelet/pods/c72eb1cc-3002-4941-878e-409ee9abeed1/volumes" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.469939 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8ebda2a-aee6-4eed-8333-5e96219fdcb3","Type":"ContainerStarted","Data":"d65560d220b0508cdd18383acef57b7ee3e6f336bff1911b49d9df550f30b608"} Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.471500 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39949247-a1b3-41bc-a94a-4c59049576cd","Type":"ContainerStarted","Data":"67256d7f709b19a7c49467b7589a840b7969dd0bd524771819f830758a4e416d"} Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.474180 4929 generic.go:334] "Generic (PLEG): container finished" podID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerID="80bfd45d586b9a6320e5eb7d06a1d12de4e11f7dc3cc374d6f14e024fb521aea" exitCode=0 Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.474204 4929 generic.go:334] "Generic (PLEG): container finished" podID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerID="6ffa84d6075b225d27a9c6883ad622967a84dcdf1ced587164a41107b03382a7" exitCode=2 Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.474212 4929 generic.go:334] "Generic (PLEG): container finished" podID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerID="1b0bf03eaf150366334d5891e2fd4d6512e69038bf20d60bcf9d1e7fda8855e2" exitCode=0 Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.474260 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerDied","Data":"80bfd45d586b9a6320e5eb7d06a1d12de4e11f7dc3cc374d6f14e024fb521aea"} Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.474319 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerDied","Data":"6ffa84d6075b225d27a9c6883ad622967a84dcdf1ced587164a41107b03382a7"} Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.474334 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerDied","Data":"1b0bf03eaf150366334d5891e2fd4d6512e69038bf20d60bcf9d1e7fda8855e2"} Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.499820 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.499793711 podStartE2EDuration="3.499793711s" podCreationTimestamp="2025-10-02 11:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:34.49003231 +0000 UTC m=+1295.040398704" watchObservedRunningTime="2025-10-02 
11:31:34.499793711 +0000 UTC m=+1295.050160075" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.945150 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6c15-account-create-zstdt"] Oct 02 11:31:34 crc kubenswrapper[4929]: E1002 11:31:34.946064 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-httpd" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.946085 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-httpd" Oct 02 11:31:34 crc kubenswrapper[4929]: E1002 11:31:34.946119 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-api" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.946127 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-api" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.946351 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-api" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.946382 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72eb1cc-3002-4941-878e-409ee9abeed1" containerName="neutron-httpd" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.947358 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c15-account-create-zstdt" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.949168 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 11:31:34 crc kubenswrapper[4929]: I1002 11:31:34.963054 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c15-account-create-zstdt"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.126645 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k478q\" (UniqueName: \"kubernetes.io/projected/2cd43d17-114a-4e7b-92ff-0d7b422aa645-kube-api-access-k478q\") pod \"nova-api-6c15-account-create-zstdt\" (UID: \"2cd43d17-114a-4e7b-92ff-0d7b422aa645\") " pod="openstack/nova-api-6c15-account-create-zstdt" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.144285 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-91e2-account-create-kh629"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.145631 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-91e2-account-create-kh629" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.149194 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.156320 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-91e2-account-create-kh629"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.227988 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k478q\" (UniqueName: \"kubernetes.io/projected/2cd43d17-114a-4e7b-92ff-0d7b422aa645-kube-api-access-k478q\") pod \"nova-api-6c15-account-create-zstdt\" (UID: \"2cd43d17-114a-4e7b-92ff-0d7b422aa645\") " pod="openstack/nova-api-6c15-account-create-zstdt" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.245674 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k478q\" (UniqueName: \"kubernetes.io/projected/2cd43d17-114a-4e7b-92ff-0d7b422aa645-kube-api-access-k478q\") pod \"nova-api-6c15-account-create-zstdt\" (UID: \"2cd43d17-114a-4e7b-92ff-0d7b422aa645\") " pod="openstack/nova-api-6c15-account-create-zstdt" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.329449 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b9e6-account-create-gtt87"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.332436 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9e6-account-create-gtt87" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.336873 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.338097 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6c15-account-create-zstdt" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.363506 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b9e6-account-create-gtt87"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.368401 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8zl\" (UniqueName: \"kubernetes.io/projected/142b5038-3c60-46fa-bcdf-97a9ae30f2c2-kube-api-access-zp8zl\") pod \"nova-cell0-91e2-account-create-kh629\" (UID: \"142b5038-3c60-46fa-bcdf-97a9ae30f2c2\") " pod="openstack/nova-cell0-91e2-account-create-kh629" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.368489 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sfk2\" (UniqueName: \"kubernetes.io/projected/dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2-kube-api-access-6sfk2\") pod \"nova-cell1-b9e6-account-create-gtt87\" (UID: \"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2\") " pod="openstack/nova-cell1-b9e6-account-create-gtt87" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.484925 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.485130 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8zl\" (UniqueName: \"kubernetes.io/projected/142b5038-3c60-46fa-bcdf-97a9ae30f2c2-kube-api-access-zp8zl\") pod \"nova-cell0-91e2-account-create-kh629\" (UID: \"142b5038-3c60-46fa-bcdf-97a9ae30f2c2\") " pod="openstack/nova-cell0-91e2-account-create-kh629" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.485192 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sfk2\" (UniqueName: \"kubernetes.io/projected/dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2-kube-api-access-6sfk2\") pod \"nova-cell1-b9e6-account-create-gtt87\" (UID: \"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2\") " pod="openstack/nova-cell1-b9e6-account-create-gtt87" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.508191 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sfk2\" (UniqueName: \"kubernetes.io/projected/dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2-kube-api-access-6sfk2\") pod \"nova-cell1-b9e6-account-create-gtt87\" (UID: \"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2\") " pod="openstack/nova-cell1-b9e6-account-create-gtt87" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.536673 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8zl\" (UniqueName: \"kubernetes.io/projected/142b5038-3c60-46fa-bcdf-97a9ae30f2c2-kube-api-access-zp8zl\") pod \"nova-cell0-91e2-account-create-kh629\" (UID: \"142b5038-3c60-46fa-bcdf-97a9ae30f2c2\") " pod="openstack/nova-cell0-91e2-account-create-kh629" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.576858 4929 generic.go:334] "Generic (PLEG): container finished" podID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerID="4e38c20007754dd62f2e497dafa007c2636559fe685e183be5b7bb54601f7a80" exitCode=0 Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.576933 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerDied","Data":"4e38c20007754dd62f2e497dafa007c2636559fe685e183be5b7bb54601f7a80"} Oct 02 11:31:35 crc kubenswrapper[4929]: 
I1002 11:31:35.583561 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39949247-a1b3-41bc-a94a-4c59049576cd","Type":"ContainerStarted","Data":"00567a0bc33f8557e178e9e93912b92e11c4d3ee3b160960eaea914e74d1fdf9"} Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.745121 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.788166 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.789769 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-91e2-account-create-kh629" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.793009 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9e6-account-create-gtt87" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.809156 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.879854 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c15-account-create-zstdt"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.898823 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rsr95"] Oct 02 11:31:35 crc kubenswrapper[4929]: I1002 11:31:35.899072 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" podUID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerName="dnsmasq-dns" containerID="cri-o://166242403a9a523000d1d0c321aee5081ce107e0acc826bfcb01bb1992281633" gracePeriod=10 Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.057864 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.098251 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-sg-core-conf-yaml\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.098668 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-combined-ca-bundle\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.098731 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd8cm\" (UniqueName: \"kubernetes.io/projected/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-kube-api-access-gd8cm\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.098782 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-run-httpd\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.098872 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-scripts\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.098981 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-log-httpd\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.099069 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-ceilometer-tls-certs\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.099250 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-config-data\") pod \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\" (UID: \"9750f97a-b1ec-4ff8-9964-b6fba3c7050d\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.103114 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.103403 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.113622 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-kube-api-access-gd8cm" (OuterVolumeSpecName: "kube-api-access-gd8cm") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "kube-api-access-gd8cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.152804 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-scripts" (OuterVolumeSpecName: "scripts") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.205677 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd8cm\" (UniqueName: \"kubernetes.io/projected/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-kube-api-access-gd8cm\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.205717 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.205727 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.205735 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.347631 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.357803 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.417678 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.418028 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.424136 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-config-data" (OuterVolumeSpecName: "config-data") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.461099 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9750f97a-b1ec-4ff8-9964-b6fba3c7050d" (UID: "9750f97a-b1ec-4ff8-9964-b6fba3c7050d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.526185 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.527246 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9750f97a-b1ec-4ff8-9964-b6fba3c7050d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.577599 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b9e6-account-create-gtt87"] Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.664919 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-91e2-account-create-kh629"] Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.688038 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9e6-account-create-gtt87" event={"ID":"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2","Type":"ContainerStarted","Data":"9fb42a89cbb6b58f2f7f8abc6a7dab523d7c6d83ac208f28c567474fb09d0969"} Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.693652 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39949247-a1b3-41bc-a94a-4c59049576cd","Type":"ContainerStarted","Data":"1369b86d88547970a8f877da10a92d101eafd5fa601f783fd300d42fabfef237"} Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.694119 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.720069 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.720039345 podStartE2EDuration="4.720039345s" podCreationTimestamp="2025-10-02 11:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-02 11:31:36.718440809 +0000 UTC m=+1297.268807173" watchObservedRunningTime="2025-10-02 11:31:36.720039345 +0000 UTC m=+1297.270405709" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.732150 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9750f97a-b1ec-4ff8-9964-b6fba3c7050d","Type":"ContainerDied","Data":"3265c54a4f57ac64adb9e2e5bd5d2acb111f7a5a5fdcb35e47f1b27d508b19fc"} Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.732201 4929 scope.go:117] "RemoveContainer" containerID="80bfd45d586b9a6320e5eb7d06a1d12de4e11f7dc3cc374d6f14e024fb521aea" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.732328 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.744042 4929 generic.go:334] "Generic (PLEG): container finished" podID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerID="166242403a9a523000d1d0c321aee5081ce107e0acc826bfcb01bb1992281633" exitCode=0 Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.744115 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" event={"ID":"045c9ef3-23a3-445f-8d6d-90233ceb023f","Type":"ContainerDied","Data":"166242403a9a523000d1d0c321aee5081ce107e0acc826bfcb01bb1992281633"} Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.748527 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.748974 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="cinder-scheduler" containerID="cri-o://653ef35682b76207b877dd55de5273bc6aa607c556d163f5c8887c299458307a" gracePeriod=30 Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.749004 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c15-account-create-zstdt" event={"ID":"2cd43d17-114a-4e7b-92ff-0d7b422aa645","Type":"ContainerStarted","Data":"c096e2aa711406812066af32f9e379cdc47cd54789bd6714fe77bfadcd569f8e"} Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.749244 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c15-account-create-zstdt" event={"ID":"2cd43d17-114a-4e7b-92ff-0d7b422aa645","Type":"ContainerStarted","Data":"823c2a0b66d9435a176541cc3594367eb1ed3caa8d126ee970445caf609ea4b2"} Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.749363 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="probe" containerID="cri-o://0e8f8533ada199772c7121378d7e1e900c80af1669d625c820c85321a406567f" gracePeriod=30 Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.786200 4929 scope.go:117] "RemoveContainer" containerID="6ffa84d6075b225d27a9c6883ad622967a84dcdf1ced587164a41107b03382a7" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.794770 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.802446 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.831734 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-6c15-account-create-zstdt" podStartSLOduration=2.831717629 podStartE2EDuration="2.831717629s" podCreationTimestamp="2025-10-02 11:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:36.814914375 +0000 UTC m=+1297.365280739" watchObservedRunningTime="2025-10-02 11:31:36.831717629 +0000 UTC m=+1297.382083993" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.840506 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:36 crc kubenswrapper[4929]: E1002 11:31:36.841215 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="ceilometer-notification-agent" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841235 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="ceilometer-notification-agent" Oct 02 11:31:36 crc kubenswrapper[4929]: E1002 11:31:36.841256 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerName="init" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841263 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerName="init" Oct 02 11:31:36 crc kubenswrapper[4929]: E1002 11:31:36.841280 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="ceilometer-central-agent" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841288 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="ceilometer-central-agent" Oct 02 11:31:36 crc kubenswrapper[4929]: E1002 11:31:36.841301 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerName="dnsmasq-dns" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841306 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerName="dnsmasq-dns" Oct 02 11:31:36 crc kubenswrapper[4929]: E1002 11:31:36.841320 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="sg-core" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841326 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="sg-core" Oct 02 11:31:36 crc kubenswrapper[4929]: E1002 11:31:36.841339 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="proxy-httpd" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841345 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="proxy-httpd" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841525 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="045c9ef3-23a3-445f-8d6d-90233ceb023f" containerName="dnsmasq-dns" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841540 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="ceilometer-central-agent" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841552 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" 
containerName="ceilometer-notification-agent" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841567 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="proxy-httpd" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.841574 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" containerName="sg-core" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.842906 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxhtw\" (UniqueName: \"kubernetes.io/projected/045c9ef3-23a3-445f-8d6d-90233ceb023f-kube-api-access-sxhtw\") pod \"045c9ef3-23a3-445f-8d6d-90233ceb023f\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.843750 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-swift-storage-0\") pod \"045c9ef3-23a3-445f-8d6d-90233ceb023f\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.843894 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.843909 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-svc\") pod \"045c9ef3-23a3-445f-8d6d-90233ceb023f\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.844574 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-nb\") pod \"045c9ef3-23a3-445f-8d6d-90233ceb023f\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.844716 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-config\") pod \"045c9ef3-23a3-445f-8d6d-90233ceb023f\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.844783 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-sb\") pod \"045c9ef3-23a3-445f-8d6d-90233ceb023f\" (UID: \"045c9ef3-23a3-445f-8d6d-90233ceb023f\") " Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.849549 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.852595 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.854414 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.854634 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.861652 4929 scope.go:117] "RemoveContainer" 
containerID="1b0bf03eaf150366334d5891e2fd4d6512e69038bf20d60bcf9d1e7fda8855e2" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.861812 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/045c9ef3-23a3-445f-8d6d-90233ceb023f-kube-api-access-sxhtw" (OuterVolumeSpecName: "kube-api-access-sxhtw") pod "045c9ef3-23a3-445f-8d6d-90233ceb023f" (UID: "045c9ef3-23a3-445f-8d6d-90233ceb023f"). InnerVolumeSpecName "kube-api-access-sxhtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.899573 4929 scope.go:117] "RemoveContainer" containerID="4e38c20007754dd62f2e497dafa007c2636559fe685e183be5b7bb54601f7a80" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949229 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-log-httpd\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949291 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949369 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949393 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949417 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-scripts\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949450 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psffs\" (UniqueName: \"kubernetes.io/projected/d2edee35-16dc-45bb-9273-a3c3de1b19b5-kube-api-access-psffs\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949465 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-config-data\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.949656 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-run-httpd\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.950002 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxhtw\" (UniqueName: \"kubernetes.io/projected/045c9ef3-23a3-445f-8d6d-90233ceb023f-kube-api-access-sxhtw\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.958427 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "045c9ef3-23a3-445f-8d6d-90233ceb023f" (UID: "045c9ef3-23a3-445f-8d6d-90233ceb023f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.968647 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-config" (OuterVolumeSpecName: "config") pod "045c9ef3-23a3-445f-8d6d-90233ceb023f" (UID: "045c9ef3-23a3-445f-8d6d-90233ceb023f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.971117 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "045c9ef3-23a3-445f-8d6d-90233ceb023f" (UID: "045c9ef3-23a3-445f-8d6d-90233ceb023f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.976690 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "045c9ef3-23a3-445f-8d6d-90233ceb023f" (UID: "045c9ef3-23a3-445f-8d6d-90233ceb023f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:36 crc kubenswrapper[4929]: I1002 11:31:36.978405 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "045c9ef3-23a3-445f-8d6d-90233ceb023f" (UID: "045c9ef3-23a3-445f-8d6d-90233ceb023f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051292 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-log-httpd\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051370 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051445 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051484 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051512 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-scripts\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051562 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psffs\" (UniqueName: \"kubernetes.io/projected/d2edee35-16dc-45bb-9273-a3c3de1b19b5-kube-api-access-psffs\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051593 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-config-data\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051633 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-run-httpd\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051734 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051761 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051774 4929 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051787 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.051797 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/045c9ef3-23a3-445f-8d6d-90233ceb023f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.053405 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-run-httpd\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.054172 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-log-httpd\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.057151 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.058477 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-scripts\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.060139 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-config-data\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.062241 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.064502 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.067439 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psffs\" (UniqueName: \"kubernetes.io/projected/d2edee35-16dc-45bb-9273-a3c3de1b19b5-kube-api-access-psffs\") pod \"ceilometer-0\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.175131 4929 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.636016 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:37 crc kubenswrapper[4929]: W1002 11:31:37.678984 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2edee35_16dc_45bb_9273_a3c3de1b19b5.slice/crio-b183f72c3e7dad36bc2b15c76842c99ae0b8e1ef390bac4dbb5931420d167a51 WatchSource:0}: Error finding container b183f72c3e7dad36bc2b15c76842c99ae0b8e1ef390bac4dbb5931420d167a51: Status 404 returned error can't find the container with id b183f72c3e7dad36bc2b15c76842c99ae0b8e1ef390bac4dbb5931420d167a51 Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.784676 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerStarted","Data":"b183f72c3e7dad36bc2b15c76842c99ae0b8e1ef390bac4dbb5931420d167a51"} Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.788341 4929 generic.go:334] "Generic (PLEG): container finished" podID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerID="0e8f8533ada199772c7121378d7e1e900c80af1669d625c820c85321a406567f" exitCode=0 Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.788425 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b308cee-10cf-4b64-87a5-47d42061f7ef","Type":"ContainerDied","Data":"0e8f8533ada199772c7121378d7e1e900c80af1669d625c820c85321a406567f"} Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.795600 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" event={"ID":"045c9ef3-23a3-445f-8d6d-90233ceb023f","Type":"ContainerDied","Data":"947ae1e71de670f1d0bb70c61a9fe3dd8f2d2d29259716b8ece947f379221a9d"} Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.795648 4929 scope.go:117] "RemoveContainer" containerID="166242403a9a523000d1d0c321aee5081ce107e0acc826bfcb01bb1992281633" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.795783 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rsr95" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.798605 4929 generic.go:334] "Generic (PLEG): container finished" podID="2cd43d17-114a-4e7b-92ff-0d7b422aa645" containerID="c096e2aa711406812066af32f9e379cdc47cd54789bd6714fe77bfadcd569f8e" exitCode=0 Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.798680 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c15-account-create-zstdt" event={"ID":"2cd43d17-114a-4e7b-92ff-0d7b422aa645","Type":"ContainerDied","Data":"c096e2aa711406812066af32f9e379cdc47cd54789bd6714fe77bfadcd569f8e"} Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.801508 4929 generic.go:334] "Generic (PLEG): container finished" podID="dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2" containerID="2626d047aaf7ff7ef08925d134860a4d2f30e9567daf32e620de10d250ba21b3" exitCode=0 Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.801588 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9e6-account-create-gtt87" event={"ID":"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2","Type":"ContainerDied","Data":"2626d047aaf7ff7ef08925d134860a4d2f30e9567daf32e620de10d250ba21b3"} Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.807330 4929 generic.go:334] "Generic (PLEG): container finished" podID="142b5038-3c60-46fa-bcdf-97a9ae30f2c2" containerID="63be97525a116a20d685df7b368cdf81587e9c84bb1a7ebc7feba61f32b3fd4a" exitCode=0 Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.807381 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-91e2-account-create-kh629" event={"ID":"142b5038-3c60-46fa-bcdf-97a9ae30f2c2","Type":"ContainerDied","Data":"63be97525a116a20d685df7b368cdf81587e9c84bb1a7ebc7feba61f32b3fd4a"} Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.807397 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-91e2-account-create-kh629" event={"ID":"142b5038-3c60-46fa-bcdf-97a9ae30f2c2","Type":"ContainerStarted","Data":"5fcab340b7b021a8e97061337829c8adca23193c56bbe84f050350e8d6633b30"} Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.833027 4929 scope.go:117] "RemoveContainer" containerID="ed1263d66e888416b6de0df344c59ecb4c81d0133e4c19cd6307cc92dee18214" Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.881381 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rsr95"] Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.887554 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rsr95"] Oct 02 11:31:37 crc kubenswrapper[4929]: I1002 11:31:37.912135 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:31:38 crc kubenswrapper[4929]: I1002 11:31:38.168690 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="045c9ef3-23a3-445f-8d6d-90233ceb023f" path="/var/lib/kubelet/pods/045c9ef3-23a3-445f-8d6d-90233ceb023f/volumes" Oct 02 11:31:38 crc kubenswrapper[4929]: I1002 11:31:38.170916 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9750f97a-b1ec-4ff8-9964-b6fba3c7050d" path="/var/lib/kubelet/pods/9750f97a-b1ec-4ff8-9964-b6fba3c7050d/volumes" Oct 02 11:31:38 crc kubenswrapper[4929]: I1002 11:31:38.827634 4929 generic.go:334] "Generic (PLEG): container finished" podID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerID="653ef35682b76207b877dd55de5273bc6aa607c556d163f5c8887c299458307a" exitCode=0 Oct 02 11:31:38 crc 
kubenswrapper[4929]: I1002 11:31:38.827758 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b308cee-10cf-4b64-87a5-47d42061f7ef","Type":"ContainerDied","Data":"653ef35682b76207b877dd55de5273bc6aa607c556d163f5c8887c299458307a"} Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.231379 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.298864 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data-custom\") pod \"9b308cee-10cf-4b64-87a5-47d42061f7ef\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.298932 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-scripts\") pod \"9b308cee-10cf-4b64-87a5-47d42061f7ef\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.298987 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skc5b\" (UniqueName: \"kubernetes.io/projected/9b308cee-10cf-4b64-87a5-47d42061f7ef-kube-api-access-skc5b\") pod \"9b308cee-10cf-4b64-87a5-47d42061f7ef\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.299052 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-combined-ca-bundle\") pod \"9b308cee-10cf-4b64-87a5-47d42061f7ef\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.299118 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b308cee-10cf-4b64-87a5-47d42061f7ef-etc-machine-id\") pod \"9b308cee-10cf-4b64-87a5-47d42061f7ef\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.299192 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data\") pod \"9b308cee-10cf-4b64-87a5-47d42061f7ef\" (UID: \"9b308cee-10cf-4b64-87a5-47d42061f7ef\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.302642 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b308cee-10cf-4b64-87a5-47d42061f7ef-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9b308cee-10cf-4b64-87a5-47d42061f7ef" (UID: "9b308cee-10cf-4b64-87a5-47d42061f7ef"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.308364 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b308cee-10cf-4b64-87a5-47d42061f7ef-kube-api-access-skc5b" (OuterVolumeSpecName: "kube-api-access-skc5b") pod "9b308cee-10cf-4b64-87a5-47d42061f7ef" (UID: "9b308cee-10cf-4b64-87a5-47d42061f7ef"). InnerVolumeSpecName "kube-api-access-skc5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.308355 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b308cee-10cf-4b64-87a5-47d42061f7ef" (UID: "9b308cee-10cf-4b64-87a5-47d42061f7ef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.308524 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-scripts" (OuterVolumeSpecName: "scripts") pod "9b308cee-10cf-4b64-87a5-47d42061f7ef" (UID: "9b308cee-10cf-4b64-87a5-47d42061f7ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.352915 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b308cee-10cf-4b64-87a5-47d42061f7ef" (UID: "9b308cee-10cf-4b64-87a5-47d42061f7ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.396310 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data" (OuterVolumeSpecName: "config-data") pod "9b308cee-10cf-4b64-87a5-47d42061f7ef" (UID: "9b308cee-10cf-4b64-87a5-47d42061f7ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.401745 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.401784 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.401798 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skc5b\" (UniqueName: \"kubernetes.io/projected/9b308cee-10cf-4b64-87a5-47d42061f7ef-kube-api-access-skc5b\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.401813 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.401825 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b308cee-10cf-4b64-87a5-47d42061f7ef-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.401839 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b308cee-10cf-4b64-87a5-47d42061f7ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.402673 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b9e6-account-create-gtt87" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.502562 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sfk2\" (UniqueName: \"kubernetes.io/projected/dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2-kube-api-access-6sfk2\") pod \"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2\" (UID: \"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.509095 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2-kube-api-access-6sfk2" (OuterVolumeSpecName: "kube-api-access-6sfk2") pod "dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2" (UID: "dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2"). InnerVolumeSpecName "kube-api-access-6sfk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.604695 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sfk2\" (UniqueName: \"kubernetes.io/projected/dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2-kube-api-access-6sfk2\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.616112 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c15-account-create-zstdt" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.621859 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-91e2-account-create-kh629" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.698407 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.698451 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.706809 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k478q\" (UniqueName: \"kubernetes.io/projected/2cd43d17-114a-4e7b-92ff-0d7b422aa645-kube-api-access-k478q\") pod \"2cd43d17-114a-4e7b-92ff-0d7b422aa645\" (UID: \"2cd43d17-114a-4e7b-92ff-0d7b422aa645\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.707088 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp8zl\" (UniqueName: \"kubernetes.io/projected/142b5038-3c60-46fa-bcdf-97a9ae30f2c2-kube-api-access-zp8zl\") pod \"142b5038-3c60-46fa-bcdf-97a9ae30f2c2\" (UID: \"142b5038-3c60-46fa-bcdf-97a9ae30f2c2\") " Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.710735 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd43d17-114a-4e7b-92ff-0d7b422aa645-kube-api-access-k478q" (OuterVolumeSpecName: "kube-api-access-k478q") pod "2cd43d17-114a-4e7b-92ff-0d7b422aa645" (UID: "2cd43d17-114a-4e7b-92ff-0d7b422aa645"). InnerVolumeSpecName "kube-api-access-k478q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.710948 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142b5038-3c60-46fa-bcdf-97a9ae30f2c2-kube-api-access-zp8zl" (OuterVolumeSpecName: "kube-api-access-zp8zl") pod "142b5038-3c60-46fa-bcdf-97a9ae30f2c2" (UID: "142b5038-3c60-46fa-bcdf-97a9ae30f2c2"). 
InnerVolumeSpecName "kube-api-access-zp8zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.733035 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.740154 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.809011 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp8zl\" (UniqueName: \"kubernetes.io/projected/142b5038-3c60-46fa-bcdf-97a9ae30f2c2-kube-api-access-zp8zl\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.809036 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k478q\" (UniqueName: \"kubernetes.io/projected/2cd43d17-114a-4e7b-92ff-0d7b422aa645-kube-api-access-k478q\") on node \"crc\" DevicePath \"\"" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.844235 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c15-account-create-zstdt" event={"ID":"2cd43d17-114a-4e7b-92ff-0d7b422aa645","Type":"ContainerDied","Data":"823c2a0b66d9435a176541cc3594367eb1ed3caa8d126ee970445caf609ea4b2"} Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.844277 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823c2a0b66d9435a176541cc3594367eb1ed3caa8d126ee970445caf609ea4b2" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.844329 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c15-account-create-zstdt" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.847536 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b9e6-account-create-gtt87" event={"ID":"dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2","Type":"ContainerDied","Data":"9fb42a89cbb6b58f2f7f8abc6a7dab523d7c6d83ac208f28c567474fb09d0969"} Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.847573 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fb42a89cbb6b58f2f7f8abc6a7dab523d7c6d83ac208f28c567474fb09d0969" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.847592 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b9e6-account-create-gtt87" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.849266 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-91e2-account-create-kh629" event={"ID":"142b5038-3c60-46fa-bcdf-97a9ae30f2c2","Type":"ContainerDied","Data":"5fcab340b7b021a8e97061337829c8adca23193c56bbe84f050350e8d6633b30"} Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.849286 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fcab340b7b021a8e97061337829c8adca23193c56bbe84f050350e8d6633b30" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.849303 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-91e2-account-create-kh629" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.857972 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.858829 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b308cee-10cf-4b64-87a5-47d42061f7ef","Type":"ContainerDied","Data":"d49cb9270a1dcba01008ee178230cf7cbd43c387648a9db325ba9eaeea83ccf4"} Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.858900 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.858975 4929 scope.go:117] "RemoveContainer" containerID="0e8f8533ada199772c7121378d7e1e900c80af1669d625c820c85321a406567f" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.859206 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.933232 4929 scope.go:117] "RemoveContainer" containerID="653ef35682b76207b877dd55de5273bc6aa607c556d163f5c8887c299458307a" Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.976072 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:39 crc kubenswrapper[4929]: I1002 11:31:39.990661 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.001110 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:40 crc kubenswrapper[4929]: E1002 11:31:40.001843 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="cinder-scheduler" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.001937 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="cinder-scheduler" Oct 02 11:31:40 crc kubenswrapper[4929]: E1002 11:31:40.002025 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="probe" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.002087 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="probe" Oct 02 11:31:40 crc kubenswrapper[4929]: E1002 11:31:40.002194 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.002276 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: E1002 11:31:40.002354 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd43d17-114a-4e7b-92ff-0d7b422aa645" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.002421 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd43d17-114a-4e7b-92ff-0d7b422aa645" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: E1002 11:31:40.002514 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142b5038-3c60-46fa-bcdf-97a9ae30f2c2" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.002588 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="142b5038-3c60-46fa-bcdf-97a9ae30f2c2" containerName="mariadb-account-create" Oct 02 
11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.002942 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="142b5038-3c60-46fa-bcdf-97a9ae30f2c2" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.003114 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.003224 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd43d17-114a-4e7b-92ff-0d7b422aa645" containerName="mariadb-account-create" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.003305 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="cinder-scheduler" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.003375 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" containerName="probe" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.004516 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.007536 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.030760 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.123892 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.123992 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.124035 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.124060 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.124168 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghbgc\" (UniqueName: \"kubernetes.io/projected/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-kube-api-access-ghbgc\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.124264 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.169011 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b308cee-10cf-4b64-87a5-47d42061f7ef" path="/var/lib/kubelet/pods/9b308cee-10cf-4b64-87a5-47d42061f7ef/volumes" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.227718 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.228072 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.228240 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.228371 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.228490 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghbgc\" (UniqueName: \"kubernetes.io/projected/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-kube-api-access-ghbgc\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.228625 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.228800 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.233511 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.234857 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.235285 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.236857 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.244540 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghbgc\" (UniqueName: \"kubernetes.io/projected/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-kube-api-access-ghbgc\") pod \"cinder-scheduler-0\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " pod="openstack/cinder-scheduler-0" Oct 02 11:31:40 crc kubenswrapper[4929]: I1002 11:31:40.319574 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.000874 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:31:41 crc kubenswrapper[4929]: W1002 11:31:41.004378 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podace60114_0dd0_4f94_aad6_b1c2ace2c9d2.slice/crio-71f533e042ae6efd7bf87043aecb7cb5be035b0c3ddb6ef83dbced3369f110e3 WatchSource:0}: Error finding container 71f533e042ae6efd7bf87043aecb7cb5be035b0c3ddb6ef83dbced3369f110e3: Status 404 returned error can't find the container with id 71f533e042ae6efd7bf87043aecb7cb5be035b0c3ddb6ef83dbced3369f110e3 Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.672346 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.673069 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.879033 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerStarted","Data":"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d"} Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.881665 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2","Type":"ContainerStarted","Data":"71f533e042ae6efd7bf87043aecb7cb5be035b0c3ddb6ef83dbced3369f110e3"} Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.902590 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.902636 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.937431 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:41 crc kubenswrapper[4929]: I1002 11:31:41.958784 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:42 crc kubenswrapper[4929]: I1002 11:31:42.894948 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:42 crc kubenswrapper[4929]: I1002 11:31:42.895249 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:43 crc kubenswrapper[4929]: I1002 11:31:43.915334 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2","Type":"ContainerStarted","Data":"e0f273d8b045c5750b84ae82eb3de630e392ed10f8ecf765920cb992fe5bf07b"} Oct 02 11:31:44 crc kubenswrapper[4929]: I1002 11:31:44.817565 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:44 crc kubenswrapper[4929]: I1002 11:31:44.824828 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:31:45 crc kubenswrapper[4929]: I1002 11:31:45.248376 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 11:31:45 crc kubenswrapper[4929]: I1002 11:31:45.886495 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7gbc"] Oct 02 11:31:45 crc kubenswrapper[4929]: I1002 11:31:45.887843 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:45 crc kubenswrapper[4929]: I1002 11:31:45.895380 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qtwfl" Oct 02 11:31:45 crc kubenswrapper[4929]: I1002 11:31:45.895621 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 11:31:45 crc kubenswrapper[4929]: I1002 11:31:45.895759 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 11:31:45 crc kubenswrapper[4929]: I1002 11:31:45.896710 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7gbc"] Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.037038 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-config-data\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.037223 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.037262 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-scripts\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.037555 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c57xm\" (UniqueName: \"kubernetes.io/projected/9b530bc3-0b1f-4607-9306-bba090124d3c-kube-api-access-c57xm\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.140370 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-config-data\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.140515 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.140551 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-scripts\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: 
\"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.140623 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c57xm\" (UniqueName: \"kubernetes.io/projected/9b530bc3-0b1f-4607-9306-bba090124d3c-kube-api-access-c57xm\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.149405 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-scripts\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.149532 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-config-data\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.149581 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.156625 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c57xm\" (UniqueName: \"kubernetes.io/projected/9b530bc3-0b1f-4607-9306-bba090124d3c-kube-api-access-c57xm\") pod \"nova-cell0-conductor-db-sync-j7gbc\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.207477 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.659871 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7gbc"] Oct 02 11:31:46 crc kubenswrapper[4929]: W1002 11:31:46.672275 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b530bc3_0b1f_4607_9306_bba090124d3c.slice/crio-d9e692e1db929be29b0ffce546db5b73853e35be630061979f52c14dffd595ba WatchSource:0}: Error finding container d9e692e1db929be29b0ffce546db5b73853e35be630061979f52c14dffd595ba: Status 404 returned error can't find the container with id d9e692e1db929be29b0ffce546db5b73853e35be630061979f52c14dffd595ba Oct 02 11:31:46 crc kubenswrapper[4929]: I1002 11:31:46.955585 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" event={"ID":"9b530bc3-0b1f-4607-9306-bba090124d3c","Type":"ContainerStarted","Data":"d9e692e1db929be29b0ffce546db5b73853e35be630061979f52c14dffd595ba"} Oct 02 11:31:47 crc kubenswrapper[4929]: I1002 11:31:47.967587 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2","Type":"ContainerStarted","Data":"6b5c65593dc6d88e4e6e6322915fd952cd6ca9e932c894eb48b4bc84d07e1554"} Oct 02 11:31:51 crc kubenswrapper[4929]: I1002 11:31:51.022099 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=12.02208245 podStartE2EDuration="12.02208245s" podCreationTimestamp="2025-10-02 11:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:31:51.013916406 +0000 UTC m=+1311.564282760" watchObservedRunningTime="2025-10-02 11:31:51.02208245 +0000 UTC m=+1311.572448814" Oct 02 11:31:55 crc kubenswrapper[4929]: I1002 11:31:55.320260 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:31:55 crc kubenswrapper[4929]: I1002 11:31:55.529642 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:32:03 crc kubenswrapper[4929]: I1002 11:32:03.142478 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerStarted","Data":"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7"} Oct 02 11:32:03 crc kubenswrapper[4929]: I1002 11:32:03.143819 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" event={"ID":"9b530bc3-0b1f-4607-9306-bba090124d3c","Type":"ContainerStarted","Data":"4be057dccda59231a48ae224c91812c2f935f4405234210c0ddc8c69220d1861"} Oct 02 11:32:03 crc kubenswrapper[4929]: I1002 11:32:03.157822 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" podStartSLOduration=4.800321049 podStartE2EDuration="18.157807151s" podCreationTimestamp="2025-10-02 11:31:45 +0000 UTC" firstStartedPulling="2025-10-02 11:31:46.676311877 +0000 UTC m=+1307.226678261" lastFinishedPulling="2025-10-02 11:32:00.033797999 +0000 UTC m=+1320.584164363" observedRunningTime="2025-10-02 11:32:03.155369011 +0000 UTC m=+1323.705735375" watchObservedRunningTime="2025-10-02 11:32:03.157807151 +0000 UTC 
m=+1323.708173515" Oct 02 11:32:04 crc kubenswrapper[4929]: I1002 11:32:04.173269 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerStarted","Data":"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc"} Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.187793 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerStarted","Data":"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8"} Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.188180 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.188015 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="sg-core" containerID="cri-o://b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc" gracePeriod=30 Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.187967 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="proxy-httpd" containerID="cri-o://c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8" gracePeriod=30 Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.188028 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-central-agent" containerID="cri-o://d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d" gracePeriod=30 Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.188455 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-notification-agent" containerID="cri-o://7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7" gracePeriod=30 Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.218204 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.406551109 podStartE2EDuration="30.218183316s" podCreationTimestamp="2025-10-02 11:31:36 +0000 UTC" firstStartedPulling="2025-10-02 11:31:37.681994508 +0000 UTC m=+1298.232360872" lastFinishedPulling="2025-10-02 11:32:05.493626715 +0000 UTC m=+1326.043993079" observedRunningTime="2025-10-02 11:32:06.216192169 +0000 UTC m=+1326.766558543" watchObservedRunningTime="2025-10-02 11:32:06.218183316 +0000 UTC m=+1326.768549680" Oct 02 11:32:06 crc kubenswrapper[4929]: I1002 11:32:06.927488 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.064142 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-sg-core-conf-yaml\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.064509 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-scripts\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.064582 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-ceilometer-tls-certs\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.064663 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psffs\" (UniqueName: \"kubernetes.io/projected/d2edee35-16dc-45bb-9273-a3c3de1b19b5-kube-api-access-psffs\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.064765 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-log-httpd\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.064880 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-combined-ca-bundle\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.064945 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-run-httpd\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.065052 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-config-data\") pod \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\" (UID: \"d2edee35-16dc-45bb-9273-a3c3de1b19b5\") " Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.067064 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.067341 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.070891 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2edee35-16dc-45bb-9273-a3c3de1b19b5-kube-api-access-psffs" (OuterVolumeSpecName: "kube-api-access-psffs") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "kube-api-access-psffs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.086182 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-scripts" (OuterVolumeSpecName: "scripts") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.090478 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.111897 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.147349 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.166132 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-config-data" (OuterVolumeSpecName: "config-data") pod "d2edee35-16dc-45bb-9273-a3c3de1b19b5" (UID: "d2edee35-16dc-45bb-9273-a3c3de1b19b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167189 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167282 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167349 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167410 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167464 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167526 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2edee35-16dc-45bb-9273-a3c3de1b19b5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167591 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psffs\" (UniqueName: \"kubernetes.io/projected/d2edee35-16dc-45bb-9273-a3c3de1b19b5-kube-api-access-psffs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.167649 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2edee35-16dc-45bb-9273-a3c3de1b19b5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.198966 4929 generic.go:334] "Generic (PLEG): container finished" podID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerID="c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8" exitCode=0 Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200459 4929 generic.go:334] "Generic (PLEG): container finished" podID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerID="b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc" exitCode=2 Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200542 4929 generic.go:334] "Generic (PLEG): container finished" podID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerID="7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7" exitCode=0 Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200625 4929 generic.go:334] "Generic (PLEG): container finished" podID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerID="d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d" exitCode=0 Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200438 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200419 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerDied","Data":"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8"} Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200936 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerDied","Data":"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc"} Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200954 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerDied","Data":"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7"} Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.200990 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerDied","Data":"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d"} Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.201002 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2edee35-16dc-45bb-9273-a3c3de1b19b5","Type":"ContainerDied","Data":"b183f72c3e7dad36bc2b15c76842c99ae0b8e1ef390bac4dbb5931420d167a51"} Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.201021 4929 scope.go:117] "RemoveContainer" containerID="c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.223606 4929 scope.go:117] "RemoveContainer" containerID="b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.253196 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.263728 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.263951 4929 scope.go:117] "RemoveContainer" containerID="7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.281558 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.285013 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-central-agent" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285042 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-central-agent" Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.285059 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-notification-agent" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285066 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-notification-agent" Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.285083 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="sg-core" Oct 
02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285089 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="sg-core" Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.285103 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="proxy-httpd" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285108 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="proxy-httpd" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285284 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-central-agent" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285303 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="ceilometer-notification-agent" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285318 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="proxy-httpd" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.285328 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" containerName="sg-core" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.289352 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.291238 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.292882 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.293023 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.294747 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.300040 4929 scope.go:117] "RemoveContainer" containerID="d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.332027 4929 scope.go:117] "RemoveContainer" containerID="c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8" Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.332575 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8\": container with ID starting with c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8 not found: ID does not exist" containerID="c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.332632 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8"} err="failed to get container status \"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8\": rpc error: code = NotFound desc = could not find container \"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8\": container with ID starting with 
c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8 not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.332665 4929 scope.go:117] "RemoveContainer" containerID="b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc" Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.333125 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc\": container with ID starting with b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc not found: ID does not exist" containerID="b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.333166 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc"} err="failed to get container status \"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc\": rpc error: code = NotFound desc = could not find container \"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc\": container with ID starting with b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.333195 4929 scope.go:117] "RemoveContainer" containerID="7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7" Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.333537 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7\": container with ID starting with 7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7 not found: ID does not exist" containerID="7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.333567 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7"} err="failed to get container status \"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7\": rpc error: code = NotFound desc = could not find container \"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7\": container with ID starting with 7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7 not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.333589 4929 scope.go:117] "RemoveContainer" containerID="d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d" Oct 02 11:32:07 crc kubenswrapper[4929]: E1002 11:32:07.333949 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d\": container with ID starting with d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d not found: ID does not exist" containerID="d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.333978 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d"} err="failed to get container status \"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d\": rpc 
error: code = NotFound desc = could not find container \"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d\": container with ID starting with d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.334010 4929 scope.go:117] "RemoveContainer" containerID="c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.334284 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8"} err="failed to get container status \"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8\": rpc error: code = NotFound desc = could not find container \"c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8\": container with ID starting with c08138b4d5047dc112b8a18f70ebea1d2878db98c2b23def0d3ab9dd818d49f8 not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.334305 4929 scope.go:117] "RemoveContainer" containerID="b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.334545 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc"} err="failed to get container status \"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc\": rpc error: code = NotFound desc = could not find container \"b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc\": container with ID starting with b6cf83e501161c9d9f51a40bfc1afdaf335af00ee1c416adab493f73fa4970fc not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.334562 4929 scope.go:117] "RemoveContainer" containerID="7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.334804 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7"} err="failed to get container status \"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7\": rpc error: code = NotFound desc = could not find container \"7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7\": container with ID starting with 7cc2a3ff1117a94fd20d935aaa30dab769b20bd44090c5974746c368125556f7 not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.334821 4929 scope.go:117] "RemoveContainer" containerID="d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.335232 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d"} err="failed to get container status \"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d\": rpc error: code = NotFound desc = could not find container \"d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d\": container with ID starting with d794dc0ac03acb055ad134582895633d215c3091cc7c509d1af96fcdc178691d not found: ID does not exist" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.377443 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-config-data\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.377503 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gdpm\" (UniqueName: \"kubernetes.io/projected/18c9245e-e681-489c-a8ba-fcec82f586d7-kube-api-access-4gdpm\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0"
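The paired "RemoveContainer" / "DeleteContainer returned error" records above are the kubelet garbage-collecting containers that CRI-O has already deleted: the runtime answers each status lookup with gRPC NotFound ("ID does not exist"), which is benign here because the end state the kubelet wants (container gone) already holds. A minimal sketch of that NotFound-tolerant cleanup pattern, assuming a hypothetical Python CRI-style client named `runtime`; the kubelet's real implementation is the Go code cited in the records themselves (scope.go, pod_container_deletor.go):

```python
# Sketch only: idempotent container removal in the style of the log above.
# `runtime` is a hypothetical CRI-like client; kubelet's real code is Go.
import grpc

def remove_container_idempotent(runtime, container_id: str) -> None:
    try:
        runtime.remove_container(container_id)
    except grpc.RpcError as err:
        if err.code() == grpc.StatusCode.NOT_FOUND:
            # "ID does not exist": the container is already gone, so the
            # delete is treated as a no-op rather than a failure.
            return
        raise
```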
pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.377694 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-log-httpd\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.377772 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-scripts\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.377958 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-run-httpd\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.378061 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.378142 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479621 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-config-data\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479682 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gdpm\" (UniqueName: \"kubernetes.io/projected/18c9245e-e681-489c-a8ba-fcec82f586d7-kube-api-access-4gdpm\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479715 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479760 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-log-httpd\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479795 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-scripts\") 
pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479872 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-run-httpd\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479921 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.479973 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.480414 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-log-httpd\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.480451 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-run-httpd\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.483923 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.484596 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.484759 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-scripts\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.484955 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-config-data\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.487223 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " 
pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.496345 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gdpm\" (UniqueName: \"kubernetes.io/projected/18c9245e-e681-489c-a8ba-fcec82f586d7-kube-api-access-4gdpm\") pod \"ceilometer-0\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " pod="openstack/ceilometer-0" Oct 02 11:32:07 crc kubenswrapper[4929]: I1002 11:32:07.621533 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:08 crc kubenswrapper[4929]: I1002 11:32:08.101134 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:08 crc kubenswrapper[4929]: W1002 11:32:08.113264 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18c9245e_e681_489c_a8ba_fcec82f586d7.slice/crio-38dc254e217a4ba841e34a025b907dea95549cca843acc47379d062922d8f66b WatchSource:0}: Error finding container 38dc254e217a4ba841e34a025b907dea95549cca843acc47379d062922d8f66b: Status 404 returned error can't find the container with id 38dc254e217a4ba841e34a025b907dea95549cca843acc47379d062922d8f66b Oct 02 11:32:08 crc kubenswrapper[4929]: I1002 11:32:08.165678 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2edee35-16dc-45bb-9273-a3c3de1b19b5" path="/var/lib/kubelet/pods/d2edee35-16dc-45bb-9273-a3c3de1b19b5/volumes" Oct 02 11:32:08 crc kubenswrapper[4929]: I1002 11:32:08.209941 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerStarted","Data":"38dc254e217a4ba841e34a025b907dea95549cca843acc47379d062922d8f66b"} Oct 02 11:32:09 crc kubenswrapper[4929]: I1002 11:32:09.223358 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerStarted","Data":"91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37"} Oct 02 11:32:09 crc kubenswrapper[4929]: I1002 11:32:09.386512 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:10 crc kubenswrapper[4929]: I1002 11:32:10.234213 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerStarted","Data":"60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8"} Oct 02 11:32:11 crc kubenswrapper[4929]: I1002 11:32:11.244836 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerStarted","Data":"1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9"} Oct 02 11:32:12 crc kubenswrapper[4929]: I1002 11:32:12.254753 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerStarted","Data":"ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f"} Oct 02 11:32:12 crc kubenswrapper[4929]: I1002 11:32:12.255406 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-central-agent" containerID="cri-o://91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37" gracePeriod=30 Oct 02 11:32:12 crc kubenswrapper[4929]: I1002 
11:32:12.255483 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:32:12 crc kubenswrapper[4929]: I1002 11:32:12.255788 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="proxy-httpd" containerID="cri-o://ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f" gracePeriod=30 Oct 02 11:32:12 crc kubenswrapper[4929]: I1002 11:32:12.255829 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="sg-core" containerID="cri-o://1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9" gracePeriod=30 Oct 02 11:32:12 crc kubenswrapper[4929]: I1002 11:32:12.255862 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-notification-agent" containerID="cri-o://60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8" gracePeriod=30 Oct 02 11:32:12 crc kubenswrapper[4929]: I1002 11:32:12.288972 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8412956409999999 podStartE2EDuration="5.288944815s" podCreationTimestamp="2025-10-02 11:32:07 +0000 UTC" firstStartedPulling="2025-10-02 11:32:08.11501789 +0000 UTC m=+1328.665384254" lastFinishedPulling="2025-10-02 11:32:11.562667064 +0000 UTC m=+1332.113033428" observedRunningTime="2025-10-02 11:32:12.281550413 +0000 UTC m=+1332.831916777" watchObservedRunningTime="2025-10-02 11:32:12.288944815 +0000 UTC m=+1332.839311179" Oct 02 11:32:13 crc kubenswrapper[4929]: I1002 11:32:13.263919 4929 generic.go:334] "Generic (PLEG): container finished" podID="9b530bc3-0b1f-4607-9306-bba090124d3c" containerID="4be057dccda59231a48ae224c91812c2f935f4405234210c0ddc8c69220d1861" exitCode=0 Oct 02 11:32:13 crc kubenswrapper[4929]: I1002 11:32:13.264001 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" event={"ID":"9b530bc3-0b1f-4607-9306-bba090124d3c","Type":"ContainerDied","Data":"4be057dccda59231a48ae224c91812c2f935f4405234210c0ddc8c69220d1861"} Oct 02 11:32:13 crc kubenswrapper[4929]: I1002 11:32:13.267386 4929 generic.go:334] "Generic (PLEG): container finished" podID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerID="ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f" exitCode=0 Oct 02 11:32:13 crc kubenswrapper[4929]: I1002 11:32:13.267707 4929 generic.go:334] "Generic (PLEG): container finished" podID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerID="1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9" exitCode=2 Oct 02 11:32:13 crc kubenswrapper[4929]: I1002 11:32:13.267721 4929 generic.go:334] "Generic (PLEG): container finished" podID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerID="60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8" exitCode=0 Oct 02 11:32:13 crc kubenswrapper[4929]: I1002 11:32:13.267587 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerDied","Data":"ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f"}
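The pod_startup_latency_tracker record above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that end-to-end figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick arithmetic check of the ceilometer-0 numbers, with the log's timestamps truncated to microseconds for datetime parsing:

```python
# Check of the pod_startup_latency_tracker record above; values copied from
# the log, truncated to microsecond precision.
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
created    = datetime.strptime("2025-10-02 11:32:07.000000", fmt)  # podCreationTimestamp
observed   = datetime.strptime("2025-10-02 11:32:12.288944", fmt)  # watchObservedRunningTime
pull_start = datetime.strptime("2025-10-02 11:32:08.115017", fmt)  # firstStartedPulling
pull_end   = datetime.strptime("2025-10-02 11:32:11.562667", fmt)  # lastFinishedPulling

e2e = (observed - created).total_seconds()            # ~5.288944 -> podStartE2EDuration="5.288944815s"
slo = e2e - (pull_end - pull_start).total_seconds()   # ~1.841294 -> podStartSLOduration=1.8412956...
print(f"e2e={e2e:.6f}s slo={slo:.6f}s")
```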
pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerDied","Data":"1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9"} Oct 02 11:32:13 crc kubenswrapper[4929]: I1002 11:32:13.267768 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerDied","Data":"60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8"} Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.718360 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.863565 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-combined-ca-bundle\") pod \"9b530bc3-0b1f-4607-9306-bba090124d3c\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.863722 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c57xm\" (UniqueName: \"kubernetes.io/projected/9b530bc3-0b1f-4607-9306-bba090124d3c-kube-api-access-c57xm\") pod \"9b530bc3-0b1f-4607-9306-bba090124d3c\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.863956 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-scripts\") pod \"9b530bc3-0b1f-4607-9306-bba090124d3c\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.864061 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-config-data\") pod \"9b530bc3-0b1f-4607-9306-bba090124d3c\" (UID: \"9b530bc3-0b1f-4607-9306-bba090124d3c\") " Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.870835 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-scripts" (OuterVolumeSpecName: "scripts") pod "9b530bc3-0b1f-4607-9306-bba090124d3c" (UID: "9b530bc3-0b1f-4607-9306-bba090124d3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.870988 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b530bc3-0b1f-4607-9306-bba090124d3c-kube-api-access-c57xm" (OuterVolumeSpecName: "kube-api-access-c57xm") pod "9b530bc3-0b1f-4607-9306-bba090124d3c" (UID: "9b530bc3-0b1f-4607-9306-bba090124d3c"). InnerVolumeSpecName "kube-api-access-c57xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.895041 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-config-data" (OuterVolumeSpecName: "config-data") pod "9b530bc3-0b1f-4607-9306-bba090124d3c" (UID: "9b530bc3-0b1f-4607-9306-bba090124d3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.899168 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b530bc3-0b1f-4607-9306-bba090124d3c" (UID: "9b530bc3-0b1f-4607-9306-bba090124d3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.966610 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c57xm\" (UniqueName: \"kubernetes.io/projected/9b530bc3-0b1f-4607-9306-bba090124d3c-kube-api-access-c57xm\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.966655 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.966667 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:14 crc kubenswrapper[4929]: I1002 11:32:14.966681 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b530bc3-0b1f-4607-9306-bba090124d3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.289780 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" event={"ID":"9b530bc3-0b1f-4607-9306-bba090124d3c","Type":"ContainerDied","Data":"d9e692e1db929be29b0ffce546db5b73853e35be630061979f52c14dffd595ba"} Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.289828 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9e692e1db929be29b0ffce546db5b73853e35be630061979f52c14dffd595ba" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.289838 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j7gbc" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.412088 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:32:15 crc kubenswrapper[4929]: E1002 11:32:15.412785 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b530bc3-0b1f-4607-9306-bba090124d3c" containerName="nova-cell0-conductor-db-sync" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.412906 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b530bc3-0b1f-4607-9306-bba090124d3c" containerName="nova-cell0-conductor-db-sync" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.413244 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b530bc3-0b1f-4607-9306-bba090124d3c" containerName="nova-cell0-conductor-db-sync" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.414048 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.417774 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qtwfl" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.417882 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.425734 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.577646 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4h7g\" (UniqueName: \"kubernetes.io/projected/95ec6412-e313-4ed7-ae20-d531571b5be6-kube-api-access-w4h7g\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.577680 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.577719 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.680169 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4h7g\" (UniqueName: \"kubernetes.io/projected/95ec6412-e313-4ed7-ae20-d531571b5be6-kube-api-access-w4h7g\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.680398 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.680505 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.686546 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.695271 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.725054 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4h7g\" (UniqueName: \"kubernetes.io/projected/95ec6412-e313-4ed7-ae20-d531571b5be6-kube-api-access-w4h7g\") pod \"nova-cell0-conductor-0\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:15 crc kubenswrapper[4929]: I1002 11:32:15.729909 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.083032 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.187908 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-log-httpd\") pod \"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.187943 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-sg-core-conf-yaml\") pod \"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188057 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-run-httpd\") pod \"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188105 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gdpm\" (UniqueName: \"kubernetes.io/projected/18c9245e-e681-489c-a8ba-fcec82f586d7-kube-api-access-4gdpm\") pod \"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188141 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-config-data\") pod \"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188195 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-scripts\") pod \"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188238 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-ceilometer-tls-certs\") pod \"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188262 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-combined-ca-bundle\") pod 
\"18c9245e-e681-489c-a8ba-fcec82f586d7\" (UID: \"18c9245e-e681-489c-a8ba-fcec82f586d7\") " Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188499 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.188526 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.189278 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.189302 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18c9245e-e681-489c-a8ba-fcec82f586d7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.192622 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-scripts" (OuterVolumeSpecName: "scripts") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.192741 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c9245e-e681-489c-a8ba-fcec82f586d7-kube-api-access-4gdpm" (OuterVolumeSpecName: "kube-api-access-4gdpm") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "kube-api-access-4gdpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.217311 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.237581 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:32:16 crc kubenswrapper[4929]: W1002 11:32:16.242172 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ec6412_e313_4ed7_ae20_d531571b5be6.slice/crio-881cd7701ce6b1028e61b958f203d5939358ab820c6a13c4b8012035c875db16 WatchSource:0}: Error finding container 881cd7701ce6b1028e61b958f203d5939358ab820c6a13c4b8012035c875db16: Status 404 returned error can't find the container with id 881cd7701ce6b1028e61b958f203d5939358ab820c6a13c4b8012035c875db16 Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.246951 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.284557 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.291017 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.291044 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.291053 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.291065 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.291074 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gdpm\" (UniqueName: \"kubernetes.io/projected/18c9245e-e681-489c-a8ba-fcec82f586d7-kube-api-access-4gdpm\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.305267 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-config-data" (OuterVolumeSpecName: "config-data") pod "18c9245e-e681-489c-a8ba-fcec82f586d7" (UID: "18c9245e-e681-489c-a8ba-fcec82f586d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.309741 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95ec6412-e313-4ed7-ae20-d531571b5be6","Type":"ContainerStarted","Data":"881cd7701ce6b1028e61b958f203d5939358ab820c6a13c4b8012035c875db16"} Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.313826 4929 generic.go:334] "Generic (PLEG): container finished" podID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerID="91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37" exitCode=0 Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.313876 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerDied","Data":"91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37"} Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.313906 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18c9245e-e681-489c-a8ba-fcec82f586d7","Type":"ContainerDied","Data":"38dc254e217a4ba841e34a025b907dea95549cca843acc47379d062922d8f66b"} Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.313926 4929 scope.go:117] "RemoveContainer" containerID="ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.314091 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.339365 4929 scope.go:117] "RemoveContainer" containerID="1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.342296 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.354140 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.365889 4929 scope.go:117] "RemoveContainer" containerID="60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373274 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.373681 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="proxy-httpd" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373695 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="proxy-httpd" Oct 02 11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.373713 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="sg-core" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373719 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="sg-core" Oct 02 11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.373744 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-central-agent" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373750 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-central-agent" Oct 02 
11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.373762 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-notification-agent" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373767 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-notification-agent" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373938 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="sg-core" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373960 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-central-agent" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373983 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="proxy-httpd" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.373991 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" containerName="ceilometer-notification-agent" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.375593 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.377535 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.379074 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.381435 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.388909 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.392642 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c9245e-e681-489c-a8ba-fcec82f586d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.404764 4929 scope.go:117] "RemoveContainer" containerID="91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.424647 4929 scope.go:117] "RemoveContainer" containerID="ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f" Oct 02 11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.425024 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f\": container with ID starting with ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f not found: ID does not exist" containerID="ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.425072 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f"} err="failed to get container status \"ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f\": rpc error: code = NotFound desc = could not find container 
\"ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f\": container with ID starting with ef471d7726410300d34be12831e1c750c9d6efd113697eb75b3607fb0731670f not found: ID does not exist" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.425102 4929 scope.go:117] "RemoveContainer" containerID="1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9" Oct 02 11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.425573 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9\": container with ID starting with 1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9 not found: ID does not exist" containerID="1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.425607 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9"} err="failed to get container status \"1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9\": rpc error: code = NotFound desc = could not find container \"1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9\": container with ID starting with 1cc575f5b1edd6327925955701819be45ecf0b9c10c645adba545cb57e00d1e9 not found: ID does not exist" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.425625 4929 scope.go:117] "RemoveContainer" containerID="60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8" Oct 02 11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.425873 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8\": container with ID starting with 60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8 not found: ID does not exist" containerID="60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.425904 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8"} err="failed to get container status \"60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8\": rpc error: code = NotFound desc = could not find container \"60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8\": container with ID starting with 60350e7418a88d0d4cbe278e5b5b044946137950b9b44b14de2a334d5e6884a8 not found: ID does not exist" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.425924 4929 scope.go:117] "RemoveContainer" containerID="91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37" Oct 02 11:32:16 crc kubenswrapper[4929]: E1002 11:32:16.426194 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37\": container with ID starting with 91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37 not found: ID does not exist" containerID="91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.426216 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37"} 
err="failed to get container status \"91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37\": rpc error: code = NotFound desc = could not find container \"91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37\": container with ID starting with 91a342095073a49518fdacadad1013554bc98480516bcb069f257e2353ae5b37 not found: ID does not exist" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494109 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-log-httpd\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494250 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2lk\" (UniqueName: \"kubernetes.io/projected/3d3a3351-f020-46b3-b66b-0a94aee376c6-kube-api-access-kd2lk\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494289 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-config-data\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494316 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494346 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494398 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494464 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-run-httpd\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.494487 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-scripts\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596156 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2lk\" (UniqueName: 
\"kubernetes.io/projected/3d3a3351-f020-46b3-b66b-0a94aee376c6-kube-api-access-kd2lk\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596209 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-config-data\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596230 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596252 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596289 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596339 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-run-httpd\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596354 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-scripts\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596388 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-log-httpd\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.596902 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-log-httpd\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.597090 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-run-httpd\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.601330 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-scripts\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.601504 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-config-data\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.601788 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.602586 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.609815 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.611557 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2lk\" (UniqueName: \"kubernetes.io/projected/3d3a3351-f020-46b3-b66b-0a94aee376c6-kube-api-access-kd2lk\") pod \"ceilometer-0\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") " pod="openstack/ceilometer-0" Oct 02 11:32:16 crc kubenswrapper[4929]: I1002 11:32:16.709379 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:32:17 crc kubenswrapper[4929]: I1002 11:32:17.187451 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:32:17 crc kubenswrapper[4929]: I1002 11:32:17.325240 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerStarted","Data":"1e6e3ef3e94ff4e9507c5fb38f0e4259341e1b0366694b2a146c4068f7cb1cb4"} Oct 02 11:32:17 crc kubenswrapper[4929]: I1002 11:32:17.327930 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95ec6412-e313-4ed7-ae20-d531571b5be6","Type":"ContainerStarted","Data":"6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67"} Oct 02 11:32:17 crc kubenswrapper[4929]: I1002 11:32:17.329356 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:17 crc kubenswrapper[4929]: I1002 11:32:17.350302 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.350168188 podStartE2EDuration="2.350168188s" podCreationTimestamp="2025-10-02 11:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:17.345546986 +0000 UTC m=+1337.895913360" watchObservedRunningTime="2025-10-02 11:32:17.350168188 +0000 UTC m=+1337.900534552" Oct 02 11:32:18 crc kubenswrapper[4929]: I1002 11:32:18.168381 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c9245e-e681-489c-a8ba-fcec82f586d7" path="/var/lib/kubelet/pods/18c9245e-e681-489c-a8ba-fcec82f586d7/volumes" Oct 02 11:32:18 crc kubenswrapper[4929]: I1002 11:32:18.340754 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerStarted","Data":"378198d3bdaf31b39f05c2d9998458f2601f1eb2b7ed72301e1c3f101ebfc685"} Oct 02 11:32:19 crc kubenswrapper[4929]: I1002 11:32:19.355434 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerStarted","Data":"dece7c301f7b642703a0082328c976ccce23b7c915e41c9d02977815199f35bb"} Oct 02 11:32:20 crc kubenswrapper[4929]: I1002 11:32:20.365481 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerStarted","Data":"e9a6ecf6c8e71cb48a05fe5105a51cfc13f258d7e3bbe228ec11f1aa2ef040d2"} Oct 02 11:32:21 crc kubenswrapper[4929]: I1002 11:32:21.379739 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerStarted","Data":"7475865ab42c5bb80c1c3b6feab81e267357d53bd3070ce6eccef6bf8fb07380"} Oct 02 11:32:21 crc kubenswrapper[4929]: I1002 11:32:21.380522 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:32:21 crc kubenswrapper[4929]: I1002 11:32:21.403224 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.031306594 podStartE2EDuration="5.403206055s" podCreationTimestamp="2025-10-02 11:32:16 +0000 UTC" firstStartedPulling="2025-10-02 11:32:17.185449524 +0000 UTC m=+1337.735815888" lastFinishedPulling="2025-10-02 
11:32:20.557348985 +0000 UTC m=+1341.107715349" observedRunningTime="2025-10-02 11:32:21.400932799 +0000 UTC m=+1341.951299163" watchObservedRunningTime="2025-10-02 11:32:21.403206055 +0000 UTC m=+1341.953572419" Oct 02 11:32:25 crc kubenswrapper[4929]: I1002 11:32:25.773950 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.251434 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vmt6n"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.253129 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.255004 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.255577 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.261878 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vmt6n"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.371129 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.372840 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.380946 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.403759 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-config-data\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.403819 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-scripts\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.403904 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.403940 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdwfh\" (UniqueName: \"kubernetes.io/projected/28f8a714-fde6-45a2-be8f-8655ab68bb45-kube-api-access-mdwfh\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.423275 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 
11:32:26.451659 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.452793 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.456506 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.481456 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.506446 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-scripts\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.506499 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd89922-a8bc-4dee-b745-b7ac52350955-logs\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.506601 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-config-data\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.507523 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.507567 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.507611 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdwfh\" (UniqueName: \"kubernetes.io/projected/28f8a714-fde6-45a2-be8f-8655ab68bb45-kube-api-access-mdwfh\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.507635 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sblsm\" (UniqueName: \"kubernetes.io/projected/4cd89922-a8bc-4dee-b745-b7ac52350955-kube-api-access-sblsm\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.507685 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-config-data\") pod 
\"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.516869 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-scripts\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.516972 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-config-data\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.530764 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.563468 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdwfh\" (UniqueName: \"kubernetes.io/projected/28f8a714-fde6-45a2-be8f-8655ab68bb45-kube-api-access-mdwfh\") pod \"nova-cell0-cell-mapping-vmt6n\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.579063 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.605456 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.606895 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.609156 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd89922-a8bc-4dee-b745-b7ac52350955-logs\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.609219 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.609271 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-config-data\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.609289 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.609319 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqxtw\" (UniqueName: \"kubernetes.io/projected/60c29e12-c8a9-4602-96d2-7a0e29857004-kube-api-access-rqxtw\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.609343 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.609377 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sblsm\" (UniqueName: \"kubernetes.io/projected/4cd89922-a8bc-4dee-b745-b7ac52350955-kube-api-access-sblsm\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.610071 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd89922-a8bc-4dee-b745-b7ac52350955-logs\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.614620 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.635903 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: 
I1002 11:32:26.636519 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-config-data\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.655037 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.656745 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.665341 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.677574 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sblsm\" (UniqueName: \"kubernetes.io/projected/4cd89922-a8bc-4dee-b745-b7ac52350955-kube-api-access-sblsm\") pod \"nova-api-0\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.677595 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.705516 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.725770 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-config-data\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.725854 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqxtw\" (UniqueName: \"kubernetes.io/projected/60c29e12-c8a9-4602-96d2-7a0e29857004-kube-api-access-rqxtw\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.725887 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.725936 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.726059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkj9\" (UniqueName: \"kubernetes.io/projected/7199f069-ddfe-477d-9253-1ca638cf2732-kube-api-access-5pkj9\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.726093 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.729925 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.733578 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.773426 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.801456 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqxtw\" (UniqueName: \"kubernetes.io/projected/60c29e12-c8a9-4602-96d2-7a0e29857004-kube-api-access-rqxtw\") pod \"nova-cell1-novncproxy-0\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.828342 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-config-data\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.828441 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b5dcf5-d689-47f9-8285-093dec40e8de-logs\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.828513 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.828569 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.828594 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-config-data\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.828617 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkj9\" (UniqueName: 
\"kubernetes.io/projected/7199f069-ddfe-477d-9253-1ca638cf2732-kube-api-access-5pkj9\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.828648 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln99c\" (UniqueName: \"kubernetes.io/projected/43b5dcf5-d689-47f9-8285-093dec40e8de-kube-api-access-ln99c\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.861397 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-config-data\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.872642 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.878747 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkj9\" (UniqueName: \"kubernetes.io/projected/7199f069-ddfe-477d-9253-1ca638cf2732-kube-api-access-5pkj9\") pod \"nova-scheduler-0\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.893609 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qkcqx"] Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.895420 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.930784 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln99c\" (UniqueName: \"kubernetes.io/projected/43b5dcf5-d689-47f9-8285-093dec40e8de-kube-api-access-ln99c\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.930891 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b5dcf5-d689-47f9-8285-093dec40e8de-logs\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.930984 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.931005 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-config-data\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.931697 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b5dcf5-d689-47f9-8285-093dec40e8de-logs\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.935239 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-config-data\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.935324 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.959160 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln99c\" (UniqueName: \"kubernetes.io/projected/43b5dcf5-d689-47f9-8285-093dec40e8de-kube-api-access-ln99c\") pod \"nova-metadata-0\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " pod="openstack/nova-metadata-0" Oct 02 11:32:26 crc kubenswrapper[4929]: I1002 11:32:26.991137 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qkcqx"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.032466 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.032824 
4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-config\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.032846 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.032891 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.032909 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvvv\" (UniqueName: \"kubernetes.io/projected/03478cff-4797-4d9b-82f6-d5588149c889-kube-api-access-dzvvv\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.032989 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-svc\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.072891 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.105820 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.134306 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.134396 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-config\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.134426 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.134484 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.134508 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzvvv\" (UniqueName: \"kubernetes.io/projected/03478cff-4797-4d9b-82f6-d5588149c889-kube-api-access-dzvvv\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.134549 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-svc\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.136434 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.137367 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.138893 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc 
kubenswrapper[4929]: I1002 11:32:27.138912 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-svc\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.141292 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-config\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.154138 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.166418 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzvvv\" (UniqueName: \"kubernetes.io/projected/03478cff-4797-4d9b-82f6-d5588149c889-kube-api-access-dzvvv\") pod \"dnsmasq-dns-bccf8f775-qkcqx\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") " pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.227202 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.418237 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.440546 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd89922-a8bc-4dee-b745-b7ac52350955","Type":"ContainerStarted","Data":"bf822cbd82d6263af8c283aed57f183dc211d69652d74733cd15835493badc28"} Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.498012 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p5n6r"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.499451 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.503443 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.504398 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 02 11:32:27 crc kubenswrapper[4929]: W1002 11:32:27.505048 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f8a714_fde6_45a2_be8f_8655ab68bb45.slice/crio-bd6a510e733a18e5f1796a3171acd245e10ce7e161f273e2b1120a532dfe32d2 WatchSource:0}: Error finding container bd6a510e733a18e5f1796a3171acd245e10ce7e161f273e2b1120a532dfe32d2: Status 404 returned error can't find the container with id bd6a510e733a18e5f1796a3171acd245e10ce7e161f273e2b1120a532dfe32d2 Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.508553 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vmt6n"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.519833 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p5n6r"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.645138 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-config-data\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.645247 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.645379 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rx6\" (UniqueName: \"kubernetes.io/projected/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-kube-api-access-d6rx6\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.645499 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-scripts\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.690274 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.704342 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.747746 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.747808 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rx6\" (UniqueName: \"kubernetes.io/projected/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-kube-api-access-d6rx6\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.747907 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-scripts\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.747972 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-config-data\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.754768 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-scripts\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.756115 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.761833 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-config-data\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.764197 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rx6\" (UniqueName: \"kubernetes.io/projected/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-kube-api-access-d6rx6\") pod \"nova-cell1-conductor-db-sync-p5n6r\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.859545 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.956360 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qkcqx"] Oct 02 11:32:27 crc kubenswrapper[4929]: W1002 11:32:27.958203 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03478cff_4797_4d9b_82f6_d5588149c889.slice/crio-e6485ec176d23ea4a7871568db6d85e9bc771304f1a751c94fc3a9980b0002da WatchSource:0}: Error finding 
container e6485ec176d23ea4a7871568db6d85e9bc771304f1a751c94fc3a9980b0002da: Status 404 returned error can't find the container with id e6485ec176d23ea4a7871568db6d85e9bc771304f1a751c94fc3a9980b0002da Oct 02 11:32:27 crc kubenswrapper[4929]: I1002 11:32:27.984466 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.455879 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p5n6r"] Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.457636 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7199f069-ddfe-477d-9253-1ca638cf2732","Type":"ContainerStarted","Data":"2d9a7ef302ced74d06f582882aa56ee4fb08a3087e77273e2140a715f63e99c0"} Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.459919 4929 generic.go:334] "Generic (PLEG): container finished" podID="03478cff-4797-4d9b-82f6-d5588149c889" containerID="f12080b65b6bea1b6c3b5c515a3991fd750e4b7f9ea6cd5b70da2316dbab77cd" exitCode=0 Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.459973 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" event={"ID":"03478cff-4797-4d9b-82f6-d5588149c889","Type":"ContainerDied","Data":"f12080b65b6bea1b6c3b5c515a3991fd750e4b7f9ea6cd5b70da2316dbab77cd"} Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.459989 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" event={"ID":"03478cff-4797-4d9b-82f6-d5588149c889","Type":"ContainerStarted","Data":"e6485ec176d23ea4a7871568db6d85e9bc771304f1a751c94fc3a9980b0002da"} Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.475617 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vmt6n" event={"ID":"28f8a714-fde6-45a2-be8f-8655ab68bb45","Type":"ContainerStarted","Data":"bc25e8a77ed881d2d4de5ab990535881b586e9d4b601a6e9153ec40e5251d305"} Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.475831 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vmt6n" event={"ID":"28f8a714-fde6-45a2-be8f-8655ab68bb45","Type":"ContainerStarted","Data":"bd6a510e733a18e5f1796a3171acd245e10ce7e161f273e2b1120a532dfe32d2"} Oct 02 11:32:28 crc kubenswrapper[4929]: W1002 11:32:28.479957 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3dc3d09_dbd9_4528_ab2f_17bb08d89d85.slice/crio-14defab846bb73f081052d56f2981b51a6f88d482a71b46bba7128b13f159291 WatchSource:0}: Error finding container 14defab846bb73f081052d56f2981b51a6f88d482a71b46bba7128b13f159291: Status 404 returned error can't find the container with id 14defab846bb73f081052d56f2981b51a6f88d482a71b46bba7128b13f159291 Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.480744 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"60c29e12-c8a9-4602-96d2-7a0e29857004","Type":"ContainerStarted","Data":"aa682ae2f55ae630d90461eb8fc8f581ed34f5eb473eed3a05405ae0dbff5764"} Oct 02 11:32:28 crc kubenswrapper[4929]: I1002 11:32:28.486178 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43b5dcf5-d689-47f9-8285-093dec40e8de","Type":"ContainerStarted","Data":"47e70a53246288cf0357ed169c69d1723a363fbc14c0c0fdcdd863cf019d7475"} Oct 02 11:32:28 crc 
kubenswrapper[4929]: I1002 11:32:28.499293 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vmt6n" podStartSLOduration=2.49927256 podStartE2EDuration="2.49927256s" podCreationTimestamp="2025-10-02 11:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:28.49610798 +0000 UTC m=+1349.046474344" watchObservedRunningTime="2025-10-02 11:32:28.49927256 +0000 UTC m=+1349.049638924" Oct 02 11:32:29 crc kubenswrapper[4929]: I1002 11:32:29.497585 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" event={"ID":"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85","Type":"ContainerStarted","Data":"14defab846bb73f081052d56f2981b51a6f88d482a71b46bba7128b13f159291"} Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.509564 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" event={"ID":"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85","Type":"ContainerStarted","Data":"acd44210c8619861169d2f06a85784af5612795c38734e9a5ca2e1173b57d742"} Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.530221 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" event={"ID":"03478cff-4797-4d9b-82f6-d5588149c889","Type":"ContainerStarted","Data":"a27c42cfbaf310401f831330bf840bc30a6282e10cdf25d8baa99d6fb094e34b"} Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.530329 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.539500 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd89922-a8bc-4dee-b745-b7ac52350955","Type":"ContainerStarted","Data":"415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7"} Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.542422 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"60c29e12-c8a9-4602-96d2-7a0e29857004","Type":"ContainerStarted","Data":"a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c"} Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.551005 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43b5dcf5-d689-47f9-8285-093dec40e8de","Type":"ContainerStarted","Data":"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3"} Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.552839 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7199f069-ddfe-477d-9253-1ca638cf2732","Type":"ContainerStarted","Data":"5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1"} Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.558075 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" podStartSLOduration=4.55804732 podStartE2EDuration="4.55804732s" podCreationTimestamp="2025-10-02 11:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:30.551317217 +0000 UTC m=+1351.101683581" watchObservedRunningTime="2025-10-02 11:32:30.55804732 +0000 UTC m=+1351.108413704" Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.558251 4929 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" podStartSLOduration=3.558243956 podStartE2EDuration="3.558243956s" podCreationTimestamp="2025-10-02 11:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:30.529221413 +0000 UTC m=+1351.079587777" watchObservedRunningTime="2025-10-02 11:32:30.558243956 +0000 UTC m=+1351.108610340" Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.586765 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.262943513 podStartE2EDuration="4.586733173s" podCreationTimestamp="2025-10-02 11:32:26 +0000 UTC" firstStartedPulling="2025-10-02 11:32:27.697247468 +0000 UTC m=+1348.247613832" lastFinishedPulling="2025-10-02 11:32:30.021037128 +0000 UTC m=+1350.571403492" observedRunningTime="2025-10-02 11:32:30.566632166 +0000 UTC m=+1351.116998540" watchObservedRunningTime="2025-10-02 11:32:30.586733173 +0000 UTC m=+1351.137099547" Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.601269 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.287342303 podStartE2EDuration="4.601243279s" podCreationTimestamp="2025-10-02 11:32:26 +0000 UTC" firstStartedPulling="2025-10-02 11:32:27.706864674 +0000 UTC m=+1348.257231038" lastFinishedPulling="2025-10-02 11:32:30.02076565 +0000 UTC m=+1350.571132014" observedRunningTime="2025-10-02 11:32:30.582577954 +0000 UTC m=+1351.132944328" watchObservedRunningTime="2025-10-02 11:32:30.601243279 +0000 UTC m=+1351.151609643" Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.701342 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:30 crc kubenswrapper[4929]: I1002 11:32:30.710111 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:31 crc kubenswrapper[4929]: I1002 11:32:31.564611 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43b5dcf5-d689-47f9-8285-093dec40e8de","Type":"ContainerStarted","Data":"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03"} Oct 02 11:32:31 crc kubenswrapper[4929]: I1002 11:32:31.565094 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-log" containerID="cri-o://9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3" gracePeriod=30 Oct 02 11:32:31 crc kubenswrapper[4929]: I1002 11:32:31.565548 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-metadata" containerID="cri-o://c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03" gracePeriod=30 Oct 02 11:32:31 crc kubenswrapper[4929]: I1002 11:32:31.568995 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd89922-a8bc-4dee-b745-b7ac52350955","Type":"ContainerStarted","Data":"0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5"} Oct 02 11:32:31 crc kubenswrapper[4929]: I1002 11:32:31.597129 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=3.446298764 podStartE2EDuration="5.597107272s" podCreationTimestamp="2025-10-02 11:32:26 +0000 UTC" firstStartedPulling="2025-10-02 11:32:27.869958682 +0000 UTC m=+1348.420325036" lastFinishedPulling="2025-10-02 11:32:30.02076719 +0000 UTC m=+1350.571133544" observedRunningTime="2025-10-02 11:32:31.588390582 +0000 UTC m=+1352.138756946" watchObservedRunningTime="2025-10-02 11:32:31.597107272 +0000 UTC m=+1352.147473636" Oct 02 11:32:31 crc kubenswrapper[4929]: I1002 11:32:31.612594 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.000792356 podStartE2EDuration="5.612579296s" podCreationTimestamp="2025-10-02 11:32:26 +0000 UTC" firstStartedPulling="2025-10-02 11:32:27.417376281 +0000 UTC m=+1347.967742645" lastFinishedPulling="2025-10-02 11:32:30.029163221 +0000 UTC m=+1350.579529585" observedRunningTime="2025-10-02 11:32:31.606667006 +0000 UTC m=+1352.157033390" watchObservedRunningTime="2025-10-02 11:32:31.612579296 +0000 UTC m=+1352.162945660" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.074223 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.106877 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.158871 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.235835 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-combined-ca-bundle\") pod \"43b5dcf5-d689-47f9-8285-093dec40e8de\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.236088 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b5dcf5-d689-47f9-8285-093dec40e8de-logs\") pod \"43b5dcf5-d689-47f9-8285-093dec40e8de\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.236151 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln99c\" (UniqueName: \"kubernetes.io/projected/43b5dcf5-d689-47f9-8285-093dec40e8de-kube-api-access-ln99c\") pod \"43b5dcf5-d689-47f9-8285-093dec40e8de\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.236181 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-config-data\") pod \"43b5dcf5-d689-47f9-8285-093dec40e8de\" (UID: \"43b5dcf5-d689-47f9-8285-093dec40e8de\") " Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.236423 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43b5dcf5-d689-47f9-8285-093dec40e8de-logs" (OuterVolumeSpecName: "logs") pod "43b5dcf5-d689-47f9-8285-093dec40e8de" (UID: "43b5dcf5-d689-47f9-8285-093dec40e8de"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.237706 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b5dcf5-d689-47f9-8285-093dec40e8de-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.250179 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b5dcf5-d689-47f9-8285-093dec40e8de-kube-api-access-ln99c" (OuterVolumeSpecName: "kube-api-access-ln99c") pod "43b5dcf5-d689-47f9-8285-093dec40e8de" (UID: "43b5dcf5-d689-47f9-8285-093dec40e8de"). InnerVolumeSpecName "kube-api-access-ln99c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.275686 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43b5dcf5-d689-47f9-8285-093dec40e8de" (UID: "43b5dcf5-d689-47f9-8285-093dec40e8de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.286106 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-config-data" (OuterVolumeSpecName: "config-data") pod "43b5dcf5-d689-47f9-8285-093dec40e8de" (UID: "43b5dcf5-d689-47f9-8285-093dec40e8de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.339523 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.339559 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln99c\" (UniqueName: \"kubernetes.io/projected/43b5dcf5-d689-47f9-8285-093dec40e8de-kube-api-access-ln99c\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.339571 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43b5dcf5-d689-47f9-8285-093dec40e8de-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.577782 4929 generic.go:334] "Generic (PLEG): container finished" podID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerID="c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03" exitCode=0 Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.577813 4929 generic.go:334] "Generic (PLEG): container finished" podID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerID="9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3" exitCode=143 Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.577835 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.577883 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43b5dcf5-d689-47f9-8285-093dec40e8de","Type":"ContainerDied","Data":"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03"} Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.577910 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43b5dcf5-d689-47f9-8285-093dec40e8de","Type":"ContainerDied","Data":"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3"} Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.577923 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"43b5dcf5-d689-47f9-8285-093dec40e8de","Type":"ContainerDied","Data":"47e70a53246288cf0357ed169c69d1723a363fbc14c0c0fdcdd863cf019d7475"} Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.577941 4929 scope.go:117] "RemoveContainer" containerID="c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.578690 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="60c29e12-c8a9-4602-96d2-7a0e29857004" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c" gracePeriod=30 Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.637039 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.646133 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.649408 4929 scope.go:117] "RemoveContainer" containerID="9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.654489 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:32 crc kubenswrapper[4929]: E1002 11:32:32.654975 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-log" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.654995 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-log" Oct 02 11:32:32 crc kubenswrapper[4929]: E1002 11:32:32.655016 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-metadata" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.655023 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-metadata" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.655225 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-metadata" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.655249 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" containerName="nova-metadata-log" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.656703 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.661033 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.664488 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.669677 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.692268 4929 scope.go:117] "RemoveContainer" containerID="c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03" Oct 02 11:32:32 crc kubenswrapper[4929]: E1002 11:32:32.692869 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03\": container with ID starting with c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03 not found: ID does not exist" containerID="c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.692928 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03"} err="failed to get container status \"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03\": rpc error: code = NotFound desc = could not find container \"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03\": container with ID starting with c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03 not found: ID does not exist" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.692950 4929 scope.go:117] "RemoveContainer" containerID="9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3" Oct 02 11:32:32 crc kubenswrapper[4929]: E1002 11:32:32.693790 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3\": container with ID starting with 9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3 not found: ID does not exist" containerID="9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.693814 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3"} err="failed to get container status \"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3\": rpc error: code = NotFound desc = could not find container \"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3\": container with ID starting with 9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3 not found: ID does not exist" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.693829 4929 scope.go:117] "RemoveContainer" containerID="c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.694323 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03"} err="failed to get container status \"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03\": rpc error: 
code = NotFound desc = could not find container \"c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03\": container with ID starting with c69a99e319a7b6661f0cc229809eebf009c623f05754171c97b7e672d992cb03 not found: ID does not exist" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.694368 4929 scope.go:117] "RemoveContainer" containerID="9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.694681 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3"} err="failed to get container status \"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3\": rpc error: code = NotFound desc = could not find container \"9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3\": container with ID starting with 9f63fc79149450afb85cda527d361cb67952da7c3ac237f7c8b24429ef9132e3 not found: ID does not exist" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.749711 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kccrp\" (UniqueName: \"kubernetes.io/projected/9e05aad5-360f-4da4-a1de-86327abf0a5b-kube-api-access-kccrp\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.749802 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e05aad5-360f-4da4-a1de-86327abf0a5b-logs\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.749855 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-config-data\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.749885 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.749906 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.851491 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.851551 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.851615 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kccrp\" (UniqueName: \"kubernetes.io/projected/9e05aad5-360f-4da4-a1de-86327abf0a5b-kube-api-access-kccrp\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.851672 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e05aad5-360f-4da4-a1de-86327abf0a5b-logs\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.851739 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-config-data\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.853371 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e05aad5-360f-4da4-a1de-86327abf0a5b-logs\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.855932 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.856576 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-config-data\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.861394 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.870345 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kccrp\" (UniqueName: \"kubernetes.io/projected/9e05aad5-360f-4da4-a1de-86327abf0a5b-kube-api-access-kccrp\") pod \"nova-metadata-0\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " pod="openstack/nova-metadata-0" Oct 02 11:32:32 crc kubenswrapper[4929]: I1002 11:32:32.982642 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.429182 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.461868 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-combined-ca-bundle\") pod \"60c29e12-c8a9-4602-96d2-7a0e29857004\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.461935 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqxtw\" (UniqueName: \"kubernetes.io/projected/60c29e12-c8a9-4602-96d2-7a0e29857004-kube-api-access-rqxtw\") pod \"60c29e12-c8a9-4602-96d2-7a0e29857004\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.462085 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-config-data\") pod \"60c29e12-c8a9-4602-96d2-7a0e29857004\" (UID: \"60c29e12-c8a9-4602-96d2-7a0e29857004\") " Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.485346 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c29e12-c8a9-4602-96d2-7a0e29857004-kube-api-access-rqxtw" (OuterVolumeSpecName: "kube-api-access-rqxtw") pod "60c29e12-c8a9-4602-96d2-7a0e29857004" (UID: "60c29e12-c8a9-4602-96d2-7a0e29857004"). InnerVolumeSpecName "kube-api-access-rqxtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.489978 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-config-data" (OuterVolumeSpecName: "config-data") pod "60c29e12-c8a9-4602-96d2-7a0e29857004" (UID: "60c29e12-c8a9-4602-96d2-7a0e29857004"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.491711 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60c29e12-c8a9-4602-96d2-7a0e29857004" (UID: "60c29e12-c8a9-4602-96d2-7a0e29857004"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.551230 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.564563 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.564589 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c29e12-c8a9-4602-96d2-7a0e29857004-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.564600 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqxtw\" (UniqueName: \"kubernetes.io/projected/60c29e12-c8a9-4602-96d2-7a0e29857004-kube-api-access-rqxtw\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.588891 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e05aad5-360f-4da4-a1de-86327abf0a5b","Type":"ContainerStarted","Data":"e52fcaa4f3709b7b096663d38f5f86b1eebc77a0db3700eaa05fea384b5433bc"} Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.591790 4929 generic.go:334] "Generic (PLEG): container finished" podID="60c29e12-c8a9-4602-96d2-7a0e29857004" containerID="a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c" exitCode=0 Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.591825 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"60c29e12-c8a9-4602-96d2-7a0e29857004","Type":"ContainerDied","Data":"a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c"} Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.591853 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"60c29e12-c8a9-4602-96d2-7a0e29857004","Type":"ContainerDied","Data":"aa682ae2f55ae630d90461eb8fc8f581ed34f5eb473eed3a05405ae0dbff5764"} Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.591872 4929 scope.go:117] "RemoveContainer" containerID="a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.592017 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.626306 4929 scope.go:117] "RemoveContainer" containerID="a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c" Oct 02 11:32:33 crc kubenswrapper[4929]: E1002 11:32:33.628349 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c\": container with ID starting with a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c not found: ID does not exist" containerID="a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.628376 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c"} err="failed to get container status \"a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c\": rpc error: code = NotFound desc = could not find container \"a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c\": container with ID starting with a939237abd949a7426d8102482abdd6c29593295308e2de036a680eb2ae8f82c not found: ID does not exist" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.648604 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.655849 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.670155 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:33 crc kubenswrapper[4929]: E1002 11:32:33.670712 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c29e12-c8a9-4602-96d2-7a0e29857004" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.670737 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c29e12-c8a9-4602-96d2-7a0e29857004" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.670996 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c29e12-c8a9-4602-96d2-7a0e29857004" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.671879 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.674192 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.674504 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.676094 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.697504 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.768647 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq6cz\" (UniqueName: \"kubernetes.io/projected/e86f887d-db93-49c4-85ed-add5f01b25f7-kube-api-access-pq6cz\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.769183 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.769244 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.769272 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.769624 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.870896 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.870988 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq6cz\" (UniqueName: \"kubernetes.io/projected/e86f887d-db93-49c4-85ed-add5f01b25f7-kube-api-access-pq6cz\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.871036 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.871069 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.871083 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.874577 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.874876 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.875099 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.878327 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.886844 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq6cz\" (UniqueName: \"kubernetes.io/projected/e86f887d-db93-49c4-85ed-add5f01b25f7-kube-api-access-pq6cz\") pod \"nova-cell1-novncproxy-0\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:33 crc kubenswrapper[4929]: I1002 11:32:33.989339 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:34 crc kubenswrapper[4929]: I1002 11:32:34.177942 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b5dcf5-d689-47f9-8285-093dec40e8de" path="/var/lib/kubelet/pods/43b5dcf5-d689-47f9-8285-093dec40e8de/volumes" Oct 02 11:32:34 crc kubenswrapper[4929]: I1002 11:32:34.178876 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c29e12-c8a9-4602-96d2-7a0e29857004" path="/var/lib/kubelet/pods/60c29e12-c8a9-4602-96d2-7a0e29857004/volumes" Oct 02 11:32:34 crc kubenswrapper[4929]: I1002 11:32:34.462922 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:32:34 crc kubenswrapper[4929]: I1002 11:32:34.600727 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e86f887d-db93-49c4-85ed-add5f01b25f7","Type":"ContainerStarted","Data":"3bf652b94e541d9dc200f08945d546a3a641fb201871293a8fa04fcc430f1df3"} Oct 02 11:32:34 crc kubenswrapper[4929]: I1002 11:32:34.610518 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e05aad5-360f-4da4-a1de-86327abf0a5b","Type":"ContainerStarted","Data":"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28"} Oct 02 11:32:34 crc kubenswrapper[4929]: I1002 11:32:34.610575 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e05aad5-360f-4da4-a1de-86327abf0a5b","Type":"ContainerStarted","Data":"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe"} Oct 02 11:32:34 crc kubenswrapper[4929]: I1002 11:32:34.640523 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.640502141 podStartE2EDuration="2.640502141s" podCreationTimestamp="2025-10-02 11:32:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:34.628338102 +0000 UTC m=+1355.178704466" watchObservedRunningTime="2025-10-02 11:32:34.640502141 +0000 UTC m=+1355.190868505" Oct 02 11:32:35 crc kubenswrapper[4929]: I1002 11:32:35.619881 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e86f887d-db93-49c4-85ed-add5f01b25f7","Type":"ContainerStarted","Data":"20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557"} Oct 02 11:32:35 crc kubenswrapper[4929]: I1002 11:32:35.622398 4929 generic.go:334] "Generic (PLEG): container finished" podID="28f8a714-fde6-45a2-be8f-8655ab68bb45" containerID="bc25e8a77ed881d2d4de5ab990535881b586e9d4b601a6e9153ec40e5251d305" exitCode=0 Oct 02 11:32:35 crc kubenswrapper[4929]: I1002 11:32:35.622477 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vmt6n" event={"ID":"28f8a714-fde6-45a2-be8f-8655ab68bb45","Type":"ContainerDied","Data":"bc25e8a77ed881d2d4de5ab990535881b586e9d4b601a6e9153ec40e5251d305"} Oct 02 11:32:35 crc kubenswrapper[4929]: I1002 11:32:35.653970 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.653940097 podStartE2EDuration="2.653940097s" podCreationTimestamp="2025-10-02 11:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:35.645945168 +0000 UTC m=+1356.196311542" 
watchObservedRunningTime="2025-10-02 11:32:35.653940097 +0000 UTC m=+1356.204306461" Oct 02 11:32:36 crc kubenswrapper[4929]: I1002 11:32:36.707122 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:32:36 crc kubenswrapper[4929]: I1002 11:32:36.707221 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.053699 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.107192 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.153848 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-config-data\") pod \"28f8a714-fde6-45a2-be8f-8655ab68bb45\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.153916 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-scripts\") pod \"28f8a714-fde6-45a2-be8f-8655ab68bb45\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.154034 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdwfh\" (UniqueName: \"kubernetes.io/projected/28f8a714-fde6-45a2-be8f-8655ab68bb45-kube-api-access-mdwfh\") pod \"28f8a714-fde6-45a2-be8f-8655ab68bb45\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.154173 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-combined-ca-bundle\") pod \"28f8a714-fde6-45a2-be8f-8655ab68bb45\" (UID: \"28f8a714-fde6-45a2-be8f-8655ab68bb45\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.167154 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f8a714-fde6-45a2-be8f-8655ab68bb45-kube-api-access-mdwfh" (OuterVolumeSpecName: "kube-api-access-mdwfh") pod "28f8a714-fde6-45a2-be8f-8655ab68bb45" (UID: "28f8a714-fde6-45a2-be8f-8655ab68bb45"). InnerVolumeSpecName "kube-api-access-mdwfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.190094 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-scripts" (OuterVolumeSpecName: "scripts") pod "28f8a714-fde6-45a2-be8f-8655ab68bb45" (UID: "28f8a714-fde6-45a2-be8f-8655ab68bb45"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.195018 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.206188 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-config-data" (OuterVolumeSpecName: "config-data") pod "28f8a714-fde6-45a2-be8f-8655ab68bb45" (UID: "28f8a714-fde6-45a2-be8f-8655ab68bb45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.213878 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28f8a714-fde6-45a2-be8f-8655ab68bb45" (UID: "28f8a714-fde6-45a2-be8f-8655ab68bb45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.229039 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.256478 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.256512 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.256521 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f8a714-fde6-45a2-be8f-8655ab68bb45-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.256541 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdwfh\" (UniqueName: \"kubernetes.io/projected/28f8a714-fde6-45a2-be8f-8655ab68bb45-kube-api-access-mdwfh\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.313399 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-t56n4"] Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.313699 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" podUID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerName="dnsmasq-dns" containerID="cri-o://db29f1886411040e4a08ceb2843d594405344a82aa0e2fab095af2cdffa5fa9c" gracePeriod=10 Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.648337 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vmt6n" event={"ID":"28f8a714-fde6-45a2-be8f-8655ab68bb45","Type":"ContainerDied","Data":"bd6a510e733a18e5f1796a3171acd245e10ce7e161f273e2b1120a532dfe32d2"} Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.648373 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd6a510e733a18e5f1796a3171acd245e10ce7e161f273e2b1120a532dfe32d2" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.648429 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vmt6n" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.663315 4929 generic.go:334] "Generic (PLEG): container finished" podID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerID="db29f1886411040e4a08ceb2843d594405344a82aa0e2fab095af2cdffa5fa9c" exitCode=0 Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.663355 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" event={"ID":"7ca7e74a-8ca4-4657-82ed-22cecb4a9267","Type":"ContainerDied","Data":"db29f1886411040e4a08ceb2843d594405344a82aa0e2fab095af2cdffa5fa9c"} Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.732170 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.796108 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.796161 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.802799 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.878835 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkwdf\" (UniqueName: \"kubernetes.io/projected/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-kube-api-access-hkwdf\") pod \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.878898 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-config\") pod \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.879024 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-sb\") pod \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.879062 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-swift-storage-0\") pod \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.879133 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-nb\") pod \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 
11:32:37.879210 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-svc\") pod \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\" (UID: \"7ca7e74a-8ca4-4657-82ed-22cecb4a9267\") " Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.912556 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-kube-api-access-hkwdf" (OuterVolumeSpecName: "kube-api-access-hkwdf") pod "7ca7e74a-8ca4-4657-82ed-22cecb4a9267" (UID: "7ca7e74a-8ca4-4657-82ed-22cecb4a9267"). InnerVolumeSpecName "kube-api-access-hkwdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.940696 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.940913 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-log" containerID="cri-o://415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7" gracePeriod=30 Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.941393 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-api" containerID="cri-o://0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5" gracePeriod=30 Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.982606 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkwdf\" (UniqueName: \"kubernetes.io/projected/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-kube-api-access-hkwdf\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.984195 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.985429 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:32:37 crc kubenswrapper[4929]: I1002 11:32:37.997385 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.002226 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ca7e74a-8ca4-4657-82ed-22cecb4a9267" (UID: "7ca7e74a-8ca4-4657-82ed-22cecb4a9267"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.016144 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-config" (OuterVolumeSpecName: "config") pod "7ca7e74a-8ca4-4657-82ed-22cecb4a9267" (UID: "7ca7e74a-8ca4-4657-82ed-22cecb4a9267"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.034424 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7ca7e74a-8ca4-4657-82ed-22cecb4a9267" (UID: "7ca7e74a-8ca4-4657-82ed-22cecb4a9267"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.034658 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ca7e74a-8ca4-4657-82ed-22cecb4a9267" (UID: "7ca7e74a-8ca4-4657-82ed-22cecb4a9267"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.046284 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ca7e74a-8ca4-4657-82ed-22cecb4a9267" (UID: "7ca7e74a-8ca4-4657-82ed-22cecb4a9267"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.084627 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.084666 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.084683 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.084695 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.085109 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ca7e74a-8ca4-4657-82ed-22cecb4a9267-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.293360 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.674732 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.676052 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-t56n4" event={"ID":"7ca7e74a-8ca4-4657-82ed-22cecb4a9267","Type":"ContainerDied","Data":"968e06ee8c084fdf4692fb0b3a4181ac1027be7fe1a467987aea8831ef52a321"} Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.676112 4929 scope.go:117] "RemoveContainer" containerID="db29f1886411040e4a08ceb2843d594405344a82aa0e2fab095af2cdffa5fa9c" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.679449 4929 generic.go:334] "Generic (PLEG): container finished" podID="c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" containerID="acd44210c8619861169d2f06a85784af5612795c38734e9a5ca2e1173b57d742" exitCode=0 Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.679516 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" event={"ID":"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85","Type":"ContainerDied","Data":"acd44210c8619861169d2f06a85784af5612795c38734e9a5ca2e1173b57d742"} Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.684186 4929 generic.go:334] "Generic (PLEG): container finished" podID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerID="415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7" exitCode=143 Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.684270 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd89922-a8bc-4dee-b745-b7ac52350955","Type":"ContainerDied","Data":"415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7"} Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.703406 4929 scope.go:117] "RemoveContainer" containerID="0efbfb7e343f52067e17462a768ba0400daf0b304b0003a143ae764810c04a89" Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.745410 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-t56n4"] Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.756660 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-t56n4"] Oct 02 11:32:38 crc kubenswrapper[4929]: I1002 11:32:38.990296 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:39 crc kubenswrapper[4929]: I1002 11:32:39.696210 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-log" containerID="cri-o://ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe" gracePeriod=30 Oct 02 11:32:39 crc kubenswrapper[4929]: I1002 11:32:39.696923 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7199f069-ddfe-477d-9253-1ca638cf2732" containerName="nova-scheduler-scheduler" containerID="cri-o://5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" gracePeriod=30 Oct 02 11:32:39 crc kubenswrapper[4929]: I1002 11:32:39.696929 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-metadata" containerID="cri-o://5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28" gracePeriod=30 Oct 02 11:32:39 crc kubenswrapper[4929]: E1002 11:32:39.923486 4929 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e05aad5_360f_4da4_a1de_86327abf0a5b.slice/crio-conmon-5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.099535 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.168318 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" path="/var/lib/kubelet/pods/7ca7e74a-8ca4-4657-82ed-22cecb4a9267/volumes" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.238446 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-combined-ca-bundle\") pod \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.238910 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6rx6\" (UniqueName: \"kubernetes.io/projected/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-kube-api-access-d6rx6\") pod \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.239031 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-config-data\") pod \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.239050 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-scripts\") pod \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\" (UID: \"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.244801 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-scripts" (OuterVolumeSpecName: "scripts") pod "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" (UID: "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.245047 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-kube-api-access-d6rx6" (OuterVolumeSpecName: "kube-api-access-d6rx6") pod "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" (UID: "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85"). InnerVolumeSpecName "kube-api-access-d6rx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.267118 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-config-data" (OuterVolumeSpecName: "config-data") pod "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" (UID: "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.273695 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" (UID: "c3dc3d09-dbd9-4528-ab2f-17bb08d89d85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.300488 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.343844 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6rx6\" (UniqueName: \"kubernetes.io/projected/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-kube-api-access-d6rx6\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.344118 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.344179 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.344234 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.445246 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kccrp\" (UniqueName: \"kubernetes.io/projected/9e05aad5-360f-4da4-a1de-86327abf0a5b-kube-api-access-kccrp\") pod \"9e05aad5-360f-4da4-a1de-86327abf0a5b\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.445787 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e05aad5-360f-4da4-a1de-86327abf0a5b-logs\") pod \"9e05aad5-360f-4da4-a1de-86327abf0a5b\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.445986 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-combined-ca-bundle\") pod \"9e05aad5-360f-4da4-a1de-86327abf0a5b\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.446101 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e05aad5-360f-4da4-a1de-86327abf0a5b-logs" (OuterVolumeSpecName: "logs") pod "9e05aad5-360f-4da4-a1de-86327abf0a5b" (UID: "9e05aad5-360f-4da4-a1de-86327abf0a5b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.446118 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-config-data\") pod \"9e05aad5-360f-4da4-a1de-86327abf0a5b\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.446291 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-nova-metadata-tls-certs\") pod \"9e05aad5-360f-4da4-a1de-86327abf0a5b\" (UID: \"9e05aad5-360f-4da4-a1de-86327abf0a5b\") " Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.446734 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e05aad5-360f-4da4-a1de-86327abf0a5b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.448678 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e05aad5-360f-4da4-a1de-86327abf0a5b-kube-api-access-kccrp" (OuterVolumeSpecName: "kube-api-access-kccrp") pod "9e05aad5-360f-4da4-a1de-86327abf0a5b" (UID: "9e05aad5-360f-4da4-a1de-86327abf0a5b"). InnerVolumeSpecName "kube-api-access-kccrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.469492 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-config-data" (OuterVolumeSpecName: "config-data") pod "9e05aad5-360f-4da4-a1de-86327abf0a5b" (UID: "9e05aad5-360f-4da4-a1de-86327abf0a5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.475812 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e05aad5-360f-4da4-a1de-86327abf0a5b" (UID: "9e05aad5-360f-4da4-a1de-86327abf0a5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.491411 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9e05aad5-360f-4da4-a1de-86327abf0a5b" (UID: "9e05aad5-360f-4da4-a1de-86327abf0a5b"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.548676 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.548712 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.548722 4929 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e05aad5-360f-4da4-a1de-86327abf0a5b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.548732 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kccrp\" (UniqueName: \"kubernetes.io/projected/9e05aad5-360f-4da4-a1de-86327abf0a5b-kube-api-access-kccrp\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.708992 4929 generic.go:334] "Generic (PLEG): container finished" podID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerID="5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28" exitCode=0 Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.709039 4929 generic.go:334] "Generic (PLEG): container finished" podID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerID="ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe" exitCode=143 Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.709081 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e05aad5-360f-4da4-a1de-86327abf0a5b","Type":"ContainerDied","Data":"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28"} Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.709143 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e05aad5-360f-4da4-a1de-86327abf0a5b","Type":"ContainerDied","Data":"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe"} Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.709158 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e05aad5-360f-4da4-a1de-86327abf0a5b","Type":"ContainerDied","Data":"e52fcaa4f3709b7b096663d38f5f86b1eebc77a0db3700eaa05fea384b5433bc"} Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.709180 4929 scope.go:117] "RemoveContainer" containerID="5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.709502 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.712374 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" event={"ID":"c3dc3d09-dbd9-4528-ab2f-17bb08d89d85","Type":"ContainerDied","Data":"14defab846bb73f081052d56f2981b51a6f88d482a71b46bba7128b13f159291"} Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.712404 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14defab846bb73f081052d56f2981b51a6f88d482a71b46bba7128b13f159291" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.712490 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p5n6r" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.805624 4929 scope.go:117] "RemoveContainer" containerID="ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.841383 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.849316 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.859546 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.860562 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f8a714-fde6-45a2-be8f-8655ab68bb45" containerName="nova-manage" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860600 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f8a714-fde6-45a2-be8f-8655ab68bb45" containerName="nova-manage" Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.860620 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" containerName="nova-cell1-conductor-db-sync" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860630 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" containerName="nova-cell1-conductor-db-sync" Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.860644 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerName="init" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860651 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerName="init" Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.860681 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerName="dnsmasq-dns" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860688 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerName="dnsmasq-dns" Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.860708 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-metadata" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860716 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-metadata" Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.860744 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-log" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860752 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-log" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860943 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca7e74a-8ca4-4657-82ed-22cecb4a9267" containerName="dnsmasq-dns" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.860984 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" containerName="nova-cell1-conductor-db-sync" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.861012 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-metadata" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.861030 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f8a714-fde6-45a2-be8f-8655ab68bb45" containerName="nova-manage" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.861047 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" containerName="nova-metadata-log" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.861847 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.864236 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.876801 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.878297 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.880305 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.880441 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.888037 4929 scope.go:117] "RemoveContainer" containerID="5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.888291 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.888380 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28\": container with ID starting with 5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28 not found: ID does not exist" containerID="5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.888431 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28"} err="failed to get container status \"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28\": rpc error: code = NotFound desc = could not find container \"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28\": container with ID starting with 5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28 not found: ID does not exist" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.888464 4929 scope.go:117] "RemoveContainer" containerID="ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe" Oct 02 11:32:40 crc kubenswrapper[4929]: E1002 11:32:40.888749 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe\": container with ID starting with ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe not found: ID does not exist" containerID="ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.888780 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe"} err="failed to get container status \"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe\": rpc error: code = NotFound desc = could not find container \"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe\": container with ID starting with ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe not found: ID does not exist" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.888798 4929 scope.go:117] "RemoveContainer" containerID="5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.889248 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28"} err="failed to get container status \"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28\": rpc 
error: code = NotFound desc = could not find container \"5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28\": container with ID starting with 5a86516551fe41b45f1e6134ac328d0b730283cb0773a328c7f8ee7d327b8f28 not found: ID does not exist" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.889286 4929 scope.go:117] "RemoveContainer" containerID="ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.889500 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe"} err="failed to get container status \"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe\": rpc error: code = NotFound desc = could not find container \"ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe\": container with ID starting with ac8a5e6b3c25f54ee67a45ff95bd79ee2fd56a01fe1ec6b5e70d4ad94661ecbe not found: ID does not exist" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.900836 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.963991 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt9mg\" (UniqueName: \"kubernetes.io/projected/b8b9fa36-f990-4cce-9544-23828715aa54-kube-api-access-nt9mg\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.964055 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.964164 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-config-data\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.964196 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxj6\" (UniqueName: \"kubernetes.io/projected/be0f785a-7ca6-4c70-acda-848269251cca-kube-api-access-tmxj6\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.964222 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.964246 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:40 
crc kubenswrapper[4929]: I1002 11:32:40.964309 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0f785a-7ca6-4c70-acda-848269251cca-logs\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:40 crc kubenswrapper[4929]: I1002 11:32:40.964338 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.065994 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt9mg\" (UniqueName: \"kubernetes.io/projected/b8b9fa36-f990-4cce-9544-23828715aa54-kube-api-access-nt9mg\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.066081 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.066151 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-config-data\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.066202 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxj6\" (UniqueName: \"kubernetes.io/projected/be0f785a-7ca6-4c70-acda-848269251cca-kube-api-access-tmxj6\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.066230 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.066524 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.067157 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0f785a-7ca6-4c70-acda-848269251cca-logs\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.067203 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.067810 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0f785a-7ca6-4c70-acda-848269251cca-logs\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.072023 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.072620 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-config-data\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.074376 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.076251 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.079399 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.087585 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxj6\" (UniqueName: \"kubernetes.io/projected/be0f785a-7ca6-4c70-acda-848269251cca-kube-api-access-tmxj6\") pod \"nova-metadata-0\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.087820 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt9mg\" (UniqueName: \"kubernetes.io/projected/b8b9fa36-f990-4cce-9544-23828715aa54-kube-api-access-nt9mg\") pod \"nova-cell1-conductor-0\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.189657 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.200860 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.661299 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.725304 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b8b9fa36-f990-4cce-9544-23828715aa54","Type":"ContainerStarted","Data":"e10e2cca3042977bed86cea313d507d05c357b429279eaa46c4153299153c684"} Oct 02 11:32:41 crc kubenswrapper[4929]: I1002 11:32:41.728227 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:42 crc kubenswrapper[4929]: E1002 11:32:42.107042 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1 is running failed: container process not found" containerID="5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:32:42 crc kubenswrapper[4929]: E1002 11:32:42.107738 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1 is running failed: container process not found" containerID="5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:32:42 crc kubenswrapper[4929]: E1002 11:32:42.108115 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1 is running failed: container process not found" containerID="5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:32:42 crc kubenswrapper[4929]: E1002 11:32:42.108149 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7199f069-ddfe-477d-9253-1ca638cf2732" containerName="nova-scheduler-scheduler" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.168774 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e05aad5-360f-4da4-a1de-86327abf0a5b" path="/var/lib/kubelet/pods/9e05aad5-360f-4da4-a1de-86327abf0a5b/volumes" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.200762 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.288325 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pkj9\" (UniqueName: \"kubernetes.io/projected/7199f069-ddfe-477d-9253-1ca638cf2732-kube-api-access-5pkj9\") pod \"7199f069-ddfe-477d-9253-1ca638cf2732\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.288824 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-combined-ca-bundle\") pod \"7199f069-ddfe-477d-9253-1ca638cf2732\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.288891 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-config-data\") pod \"7199f069-ddfe-477d-9253-1ca638cf2732\" (UID: \"7199f069-ddfe-477d-9253-1ca638cf2732\") " Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.301679 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7199f069-ddfe-477d-9253-1ca638cf2732-kube-api-access-5pkj9" (OuterVolumeSpecName: "kube-api-access-5pkj9") pod "7199f069-ddfe-477d-9253-1ca638cf2732" (UID: "7199f069-ddfe-477d-9253-1ca638cf2732"). InnerVolumeSpecName "kube-api-access-5pkj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.320544 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-config-data" (OuterVolumeSpecName: "config-data") pod "7199f069-ddfe-477d-9253-1ca638cf2732" (UID: "7199f069-ddfe-477d-9253-1ca638cf2732"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.324176 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7199f069-ddfe-477d-9253-1ca638cf2732" (UID: "7199f069-ddfe-477d-9253-1ca638cf2732"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.391200 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pkj9\" (UniqueName: \"kubernetes.io/projected/7199f069-ddfe-477d-9253-1ca638cf2732-kube-api-access-5pkj9\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.391246 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.391259 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7199f069-ddfe-477d-9253-1ca638cf2732-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.738785 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b8b9fa36-f990-4cce-9544-23828715aa54","Type":"ContainerStarted","Data":"8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412"} Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.739125 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.740743 4929 generic.go:334] "Generic (PLEG): container finished" podID="7199f069-ddfe-477d-9253-1ca638cf2732" containerID="5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" exitCode=0 Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.740800 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7199f069-ddfe-477d-9253-1ca638cf2732","Type":"ContainerDied","Data":"5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1"} Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.740818 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7199f069-ddfe-477d-9253-1ca638cf2732","Type":"ContainerDied","Data":"2d9a7ef302ced74d06f582882aa56ee4fb08a3087e77273e2140a715f63e99c0"} Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.740836 4929 scope.go:117] "RemoveContainer" containerID="5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.740933 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.759621 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be0f785a-7ca6-4c70-acda-848269251cca","Type":"ContainerStarted","Data":"d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63"} Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.759698 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be0f785a-7ca6-4c70-acda-848269251cca","Type":"ContainerStarted","Data":"de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a"} Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.759712 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be0f785a-7ca6-4c70-acda-848269251cca","Type":"ContainerStarted","Data":"b61993ff6851b34a15bb4fda5ac24f1217d1c62834f6dbd5e1d83c7d36db1d60"} Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.781559 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.781514946 podStartE2EDuration="2.781514946s" podCreationTimestamp="2025-10-02 11:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:42.776167483 +0000 UTC m=+1363.326533847" watchObservedRunningTime="2025-10-02 11:32:42.781514946 +0000 UTC m=+1363.331881310" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.788287 4929 scope.go:117] "RemoveContainer" containerID="5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" Oct 02 11:32:42 crc kubenswrapper[4929]: E1002 11:32:42.791141 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1\": container with ID starting with 5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1 not found: ID does not exist" containerID="5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.791180 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1"} err="failed to get container status \"5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1\": rpc error: code = NotFound desc = could not find container \"5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1\": container with ID starting with 5e8b59c7a61a65b015ee6450c23953b93372f9a34f3af2e3210571bc4e38a3b1 not found: ID does not exist" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.807454 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.820153 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.831652 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.832203 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.83218874 podStartE2EDuration="2.83218874s" podCreationTimestamp="2025-10-02 11:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:42.809867719 +0000 UTC m=+1363.360234083" watchObservedRunningTime="2025-10-02 11:32:42.83218874 +0000 UTC m=+1363.382555104" Oct 02 11:32:42 crc kubenswrapper[4929]: E1002 11:32:42.832237 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7199f069-ddfe-477d-9253-1ca638cf2732" containerName="nova-scheduler-scheduler" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.832371 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7199f069-ddfe-477d-9253-1ca638cf2732" containerName="nova-scheduler-scheduler" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.832595 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="7199f069-ddfe-477d-9253-1ca638cf2732" containerName="nova-scheduler-scheduler" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.833292 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.835661 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.851025 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.900503 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.900621 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbw96\" (UniqueName: \"kubernetes.io/projected/c91c4a43-20c7-407e-8651-48f9956ba3da-kube-api-access-cbw96\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:42 crc kubenswrapper[4929]: I1002 11:32:42.900767 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-config-data\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.004147 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.004485 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbw96\" (UniqueName: \"kubernetes.io/projected/c91c4a43-20c7-407e-8651-48f9956ba3da-kube-api-access-cbw96\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.004623 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-config-data\") pod \"nova-scheduler-0\" (UID: 
\"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.010712 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-config-data\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.010901 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.028846 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbw96\" (UniqueName: \"kubernetes.io/projected/c91c4a43-20c7-407e-8651-48f9956ba3da-kube-api-access-cbw96\") pod \"nova-scheduler-0\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.195198 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.651549 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:43 crc kubenswrapper[4929]: W1002 11:32:43.670370 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc91c4a43_20c7_407e_8651_48f9956ba3da.slice/crio-60b5bd8d6df3aa34987e52e1d32bfce01d74cc29af05b9a58d63d7884dabb3f9 WatchSource:0}: Error finding container 60b5bd8d6df3aa34987e52e1d32bfce01d74cc29af05b9a58d63d7884dabb3f9: Status 404 returned error can't find the container with id 60b5bd8d6df3aa34987e52e1d32bfce01d74cc29af05b9a58d63d7884dabb3f9 Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.761727 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.774629 4929 generic.go:334] "Generic (PLEG): container finished" podID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerID="0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5" exitCode=0 Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.774762 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd89922-a8bc-4dee-b745-b7ac52350955","Type":"ContainerDied","Data":"0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5"} Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.774796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cd89922-a8bc-4dee-b745-b7ac52350955","Type":"ContainerDied","Data":"bf822cbd82d6263af8c283aed57f183dc211d69652d74733cd15835493badc28"} Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.774819 4929 scope.go:117] "RemoveContainer" containerID="0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.775021 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.778172 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c91c4a43-20c7-407e-8651-48f9956ba3da","Type":"ContainerStarted","Data":"60b5bd8d6df3aa34987e52e1d32bfce01d74cc29af05b9a58d63d7884dabb3f9"} Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.807283 4929 scope.go:117] "RemoveContainer" containerID="415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.822326 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-config-data\") pod \"4cd89922-a8bc-4dee-b745-b7ac52350955\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.822402 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sblsm\" (UniqueName: \"kubernetes.io/projected/4cd89922-a8bc-4dee-b745-b7ac52350955-kube-api-access-sblsm\") pod \"4cd89922-a8bc-4dee-b745-b7ac52350955\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.822458 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd89922-a8bc-4dee-b745-b7ac52350955-logs\") pod \"4cd89922-a8bc-4dee-b745-b7ac52350955\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.822517 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-combined-ca-bundle\") pod \"4cd89922-a8bc-4dee-b745-b7ac52350955\" (UID: \"4cd89922-a8bc-4dee-b745-b7ac52350955\") " Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.823456 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd89922-a8bc-4dee-b745-b7ac52350955-logs" (OuterVolumeSpecName: "logs") pod "4cd89922-a8bc-4dee-b745-b7ac52350955" (UID: "4cd89922-a8bc-4dee-b745-b7ac52350955"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.823704 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cd89922-a8bc-4dee-b745-b7ac52350955-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.830044 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd89922-a8bc-4dee-b745-b7ac52350955-kube-api-access-sblsm" (OuterVolumeSpecName: "kube-api-access-sblsm") pod "4cd89922-a8bc-4dee-b745-b7ac52350955" (UID: "4cd89922-a8bc-4dee-b745-b7ac52350955"). InnerVolumeSpecName "kube-api-access-sblsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.831237 4929 scope.go:117] "RemoveContainer" containerID="0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5" Oct 02 11:32:43 crc kubenswrapper[4929]: E1002 11:32:43.831778 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5\": container with ID starting with 0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5 not found: ID does not exist" containerID="0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.831819 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5"} err="failed to get container status \"0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5\": rpc error: code = NotFound desc = could not find container \"0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5\": container with ID starting with 0f2eb91e9c71d5bc7a227c3132b68beccd37377791d1190551cc6255ea4941f5 not found: ID does not exist" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.831845 4929 scope.go:117] "RemoveContainer" containerID="415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7" Oct 02 11:32:43 crc kubenswrapper[4929]: E1002 11:32:43.832549 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7\": container with ID starting with 415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7 not found: ID does not exist" containerID="415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.832582 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7"} err="failed to get container status \"415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7\": rpc error: code = NotFound desc = could not find container \"415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7\": container with ID starting with 415c335f9656082e4ac49271432ed87576869c3f8fa00451fba6fd18c038e6a7 not found: ID does not exist" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.858176 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-config-data" (OuterVolumeSpecName: "config-data") pod "4cd89922-a8bc-4dee-b745-b7ac52350955" (UID: "4cd89922-a8bc-4dee-b745-b7ac52350955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.858404 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cd89922-a8bc-4dee-b745-b7ac52350955" (UID: "4cd89922-a8bc-4dee-b745-b7ac52350955"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.925542 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.925576 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sblsm\" (UniqueName: \"kubernetes.io/projected/4cd89922-a8bc-4dee-b745-b7ac52350955-kube-api-access-sblsm\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.925585 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cd89922-a8bc-4dee-b745-b7ac52350955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:43 crc kubenswrapper[4929]: I1002 11:32:43.989929 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.013797 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.124229 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.141619 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.152071 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:44 crc kubenswrapper[4929]: E1002 11:32:44.152563 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-log" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.152587 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-log" Oct 02 11:32:44 crc kubenswrapper[4929]: E1002 11:32:44.152601 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-api" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.152609 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-api" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.152839 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-log" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.152868 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" containerName="nova-api-api" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.153886 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.159490 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.176652 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd89922-a8bc-4dee-b745-b7ac52350955" path="/var/lib/kubelet/pods/4cd89922-a8bc-4dee-b745-b7ac52350955/volumes" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.177519 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7199f069-ddfe-477d-9253-1ca638cf2732" path="/var/lib/kubelet/pods/7199f069-ddfe-477d-9253-1ca638cf2732/volumes" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.178158 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.231482 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.231658 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw29v\" (UniqueName: \"kubernetes.io/projected/d05d7753-47ab-40ab-b48a-ad9a67d207c4-kube-api-access-hw29v\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.231787 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05d7753-47ab-40ab-b48a-ad9a67d207c4-logs\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.232173 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-config-data\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.333646 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-config-data\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.334019 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.334070 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw29v\" (UniqueName: \"kubernetes.io/projected/d05d7753-47ab-40ab-b48a-ad9a67d207c4-kube-api-access-hw29v\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.334097 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05d7753-47ab-40ab-b48a-ad9a67d207c4-logs\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.334488 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05d7753-47ab-40ab-b48a-ad9a67d207c4-logs\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.351901 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw29v\" (UniqueName: \"kubernetes.io/projected/d05d7753-47ab-40ab-b48a-ad9a67d207c4-kube-api-access-hw29v\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.352887 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-config-data\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.353427 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.469346 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.791741 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c91c4a43-20c7-407e-8651-48f9956ba3da","Type":"ContainerStarted","Data":"2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d"} Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.805172 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.805155178 podStartE2EDuration="2.805155178s" podCreationTimestamp="2025-10-02 11:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:44.803762608 +0000 UTC m=+1365.354128972" watchObservedRunningTime="2025-10-02 11:32:44.805155178 +0000 UTC m=+1365.355521552" Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.812010 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:32:44 crc kubenswrapper[4929]: W1002 11:32:44.914144 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd05d7753_47ab_40ab_b48a_ad9a67d207c4.slice/crio-9a05e9c86f3d308b3473e809efdfb16402e145e6c5ebfb32886a2d2003723082 WatchSource:0}: Error finding container 9a05e9c86f3d308b3473e809efdfb16402e145e6c5ebfb32886a2d2003723082: Status 404 returned error can't find the container with id 9a05e9c86f3d308b3473e809efdfb16402e145e6c5ebfb32886a2d2003723082 Oct 02 11:32:44 crc kubenswrapper[4929]: I1002 11:32:44.919447 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:45 crc kubenswrapper[4929]: I1002 11:32:45.807278 4929 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"d05d7753-47ab-40ab-b48a-ad9a67d207c4","Type":"ContainerStarted","Data":"445a4669cfd1367d95c771ba012b31997713784976e57bab92cfb76f32dcc7e0"} Oct 02 11:32:45 crc kubenswrapper[4929]: I1002 11:32:45.807711 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d05d7753-47ab-40ab-b48a-ad9a67d207c4","Type":"ContainerStarted","Data":"585d4372ec772d40f98a30f22a46df71321b6eed1d9cb23c362b9295f7e991b0"} Oct 02 11:32:45 crc kubenswrapper[4929]: I1002 11:32:45.807736 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d05d7753-47ab-40ab-b48a-ad9a67d207c4","Type":"ContainerStarted","Data":"9a05e9c86f3d308b3473e809efdfb16402e145e6c5ebfb32886a2d2003723082"} Oct 02 11:32:45 crc kubenswrapper[4929]: I1002 11:32:45.828524 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.8285039589999998 podStartE2EDuration="1.828503959s" podCreationTimestamp="2025-10-02 11:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:45.820067327 +0000 UTC m=+1366.370433691" watchObservedRunningTime="2025-10-02 11:32:45.828503959 +0000 UTC m=+1366.378870323" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.201837 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.202231 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.217320 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.674765 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mxwxk"] Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.676458 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.678792 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.679017 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.682968 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mxwxk"] Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.723515 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.786286 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.786598 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bkq2\" (UniqueName: \"kubernetes.io/projected/8f111be9-1a7c-4249-9cd0-26df16b3bf63-kube-api-access-6bkq2\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.786779 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-config-data\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.786891 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-scripts\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.889118 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-config-data\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.889208 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-scripts\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.889338 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc 
kubenswrapper[4929]: I1002 11:32:46.889403 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bkq2\" (UniqueName: \"kubernetes.io/projected/8f111be9-1a7c-4249-9cd0-26df16b3bf63-kube-api-access-6bkq2\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.896117 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-config-data\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.897577 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-scripts\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.897610 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.910486 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bkq2\" (UniqueName: \"kubernetes.io/projected/8f111be9-1a7c-4249-9cd0-26df16b3bf63-kube-api-access-6bkq2\") pod \"nova-cell1-cell-mapping-mxwxk\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:46 crc kubenswrapper[4929]: I1002 11:32:46.995550 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:47 crc kubenswrapper[4929]: I1002 11:32:47.434275 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mxwxk"] Oct 02 11:32:47 crc kubenswrapper[4929]: I1002 11:32:47.822842 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mxwxk" event={"ID":"8f111be9-1a7c-4249-9cd0-26df16b3bf63","Type":"ContainerStarted","Data":"c0eb3789f6d2a7ebb7bd842f108822fb6688be628ad39ba69601664e03c28d2a"} Oct 02 11:32:47 crc kubenswrapper[4929]: I1002 11:32:47.823201 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mxwxk" event={"ID":"8f111be9-1a7c-4249-9cd0-26df16b3bf63","Type":"ContainerStarted","Data":"dba0167f74e1a2544da5b4675b5c215b56a014c971827521ff2b9dca776c5cb8"} Oct 02 11:32:47 crc kubenswrapper[4929]: I1002 11:32:47.842832 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mxwxk" podStartSLOduration=1.842813332 podStartE2EDuration="1.842813332s" podCreationTimestamp="2025-10-02 11:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:32:47.838626842 +0000 UTC m=+1368.388993206" watchObservedRunningTime="2025-10-02 11:32:47.842813332 +0000 UTC m=+1368.393179686" Oct 02 11:32:48 crc kubenswrapper[4929]: I1002 11:32:48.196141 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:32:51 crc kubenswrapper[4929]: I1002 11:32:51.201799 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:32:51 crc kubenswrapper[4929]: I1002 11:32:51.202335 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:32:52 crc kubenswrapper[4929]: I1002 11:32:52.215108 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:32:52 crc kubenswrapper[4929]: I1002 11:32:52.215129 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:32:52 crc kubenswrapper[4929]: I1002 11:32:52.874796 4929 generic.go:334] "Generic (PLEG): container finished" podID="8f111be9-1a7c-4249-9cd0-26df16b3bf63" containerID="c0eb3789f6d2a7ebb7bd842f108822fb6688be628ad39ba69601664e03c28d2a" exitCode=0 Oct 02 11:32:52 crc kubenswrapper[4929]: I1002 11:32:52.874853 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mxwxk" event={"ID":"8f111be9-1a7c-4249-9cd0-26df16b3bf63","Type":"ContainerDied","Data":"c0eb3789f6d2a7ebb7bd842f108822fb6688be628ad39ba69601664e03c28d2a"} Oct 02 11:32:53 crc kubenswrapper[4929]: I1002 11:32:53.196382 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:32:53 crc kubenswrapper[4929]: I1002 11:32:53.230562 4929 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:32:53 crc kubenswrapper[4929]: I1002 11:32:53.918689 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.277456 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.341689 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-combined-ca-bundle\") pod \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.341759 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-config-data\") pod \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.341805 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bkq2\" (UniqueName: \"kubernetes.io/projected/8f111be9-1a7c-4249-9cd0-26df16b3bf63-kube-api-access-6bkq2\") pod \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.341856 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-scripts\") pod \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\" (UID: \"8f111be9-1a7c-4249-9cd0-26df16b3bf63\") " Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.349062 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-scripts" (OuterVolumeSpecName: "scripts") pod "8f111be9-1a7c-4249-9cd0-26df16b3bf63" (UID: "8f111be9-1a7c-4249-9cd0-26df16b3bf63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.361252 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f111be9-1a7c-4249-9cd0-26df16b3bf63-kube-api-access-6bkq2" (OuterVolumeSpecName: "kube-api-access-6bkq2") pod "8f111be9-1a7c-4249-9cd0-26df16b3bf63" (UID: "8f111be9-1a7c-4249-9cd0-26df16b3bf63"). InnerVolumeSpecName "kube-api-access-6bkq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.382615 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-config-data" (OuterVolumeSpecName: "config-data") pod "8f111be9-1a7c-4249-9cd0-26df16b3bf63" (UID: "8f111be9-1a7c-4249-9cd0-26df16b3bf63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.382713 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f111be9-1a7c-4249-9cd0-26df16b3bf63" (UID: "8f111be9-1a7c-4249-9cd0-26df16b3bf63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.444147 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.444434 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.444444 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bkq2\" (UniqueName: \"kubernetes.io/projected/8f111be9-1a7c-4249-9cd0-26df16b3bf63-kube-api-access-6bkq2\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.444456 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f111be9-1a7c-4249-9cd0-26df16b3bf63-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.470300 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.470337 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.894147 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mxwxk" Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.894288 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mxwxk" event={"ID":"8f111be9-1a7c-4249-9cd0-26df16b3bf63","Type":"ContainerDied","Data":"dba0167f74e1a2544da5b4675b5c215b56a014c971827521ff2b9dca776c5cb8"} Oct 02 11:32:54 crc kubenswrapper[4929]: I1002 11:32:54.894321 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dba0167f74e1a2544da5b4675b5c215b56a014c971827521ff2b9dca776c5cb8" Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.082541 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.082836 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-log" containerID="cri-o://585d4372ec772d40f98a30f22a46df71321b6eed1d9cb23c362b9295f7e991b0" gracePeriod=30 Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.082867 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-api" containerID="cri-o://445a4669cfd1367d95c771ba012b31997713784976e57bab92cfb76f32dcc7e0" gracePeriod=30 Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.094938 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.114923 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.115231 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-log" 
containerID="cri-o://de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a" gracePeriod=30 Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.115391 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-metadata" containerID="cri-o://d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63" gracePeriod=30 Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.133685 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": EOF" Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.133807 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": EOF" Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.904504 4929 generic.go:334] "Generic (PLEG): container finished" podID="be0f785a-7ca6-4c70-acda-848269251cca" containerID="de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a" exitCode=143 Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.904586 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be0f785a-7ca6-4c70-acda-848269251cca","Type":"ContainerDied","Data":"de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a"} Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.906521 4929 generic.go:334] "Generic (PLEG): container finished" podID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerID="585d4372ec772d40f98a30f22a46df71321b6eed1d9cb23c362b9295f7e991b0" exitCode=143 Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.906606 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d05d7753-47ab-40ab-b48a-ad9a67d207c4","Type":"ContainerDied","Data":"585d4372ec772d40f98a30f22a46df71321b6eed1d9cb23c362b9295f7e991b0"} Oct 02 11:32:55 crc kubenswrapper[4929]: I1002 11:32:55.906703 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c91c4a43-20c7-407e-8651-48f9956ba3da" containerName="nova-scheduler-scheduler" containerID="cri-o://2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d" gracePeriod=30 Oct 02 11:32:58 crc kubenswrapper[4929]: E1002 11:32:58.198380 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:32:58 crc kubenswrapper[4929]: E1002 11:32:58.200729 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:32:58 crc kubenswrapper[4929]: E1002 11:32:58.203131 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:32:58 crc kubenswrapper[4929]: E1002 11:32:58.203183 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c91c4a43-20c7-407e-8651-48f9956ba3da" containerName="nova-scheduler-scheduler" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.726258 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.848294 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0f785a-7ca6-4c70-acda-848269251cca-logs\") pod \"be0f785a-7ca6-4c70-acda-848269251cca\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.848530 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-combined-ca-bundle\") pod \"be0f785a-7ca6-4c70-acda-848269251cca\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.848569 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmxj6\" (UniqueName: \"kubernetes.io/projected/be0f785a-7ca6-4c70-acda-848269251cca-kube-api-access-tmxj6\") pod \"be0f785a-7ca6-4c70-acda-848269251cca\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.848640 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-config-data\") pod \"be0f785a-7ca6-4c70-acda-848269251cca\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.848696 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-nova-metadata-tls-certs\") pod \"be0f785a-7ca6-4c70-acda-848269251cca\" (UID: \"be0f785a-7ca6-4c70-acda-848269251cca\") " Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.848979 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be0f785a-7ca6-4c70-acda-848269251cca-logs" (OuterVolumeSpecName: "logs") pod "be0f785a-7ca6-4c70-acda-848269251cca" (UID: "be0f785a-7ca6-4c70-acda-848269251cca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.849648 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be0f785a-7ca6-4c70-acda-848269251cca-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.855356 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be0f785a-7ca6-4c70-acda-848269251cca-kube-api-access-tmxj6" (OuterVolumeSpecName: "kube-api-access-tmxj6") pod "be0f785a-7ca6-4c70-acda-848269251cca" (UID: "be0f785a-7ca6-4c70-acda-848269251cca"). InnerVolumeSpecName "kube-api-access-tmxj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.875277 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be0f785a-7ca6-4c70-acda-848269251cca" (UID: "be0f785a-7ca6-4c70-acda-848269251cca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.876217 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-config-data" (OuterVolumeSpecName: "config-data") pod "be0f785a-7ca6-4c70-acda-848269251cca" (UID: "be0f785a-7ca6-4c70-acda-848269251cca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.911090 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "be0f785a-7ca6-4c70-acda-848269251cca" (UID: "be0f785a-7ca6-4c70-acda-848269251cca"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.945282 4929 generic.go:334] "Generic (PLEG): container finished" podID="be0f785a-7ca6-4c70-acda-848269251cca" containerID="d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63" exitCode=0 Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.945331 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.945333 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be0f785a-7ca6-4c70-acda-848269251cca","Type":"ContainerDied","Data":"d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63"} Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.945364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"be0f785a-7ca6-4c70-acda-848269251cca","Type":"ContainerDied","Data":"b61993ff6851b34a15bb4fda5ac24f1217d1c62834f6dbd5e1d83c7d36db1d60"} Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.945383 4929 scope.go:117] "RemoveContainer" containerID="d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.952086 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.952122 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmxj6\" (UniqueName: \"kubernetes.io/projected/be0f785a-7ca6-4c70-acda-848269251cca-kube-api-access-tmxj6\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.952136 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.952145 4929 reconciler_common.go:293] "Volume detached for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/be0f785a-7ca6-4c70-acda-848269251cca-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.976299 4929 scope.go:117] "RemoveContainer" containerID="de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.983412 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.997169 4929 scope.go:117] "RemoveContainer" containerID="d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63" Oct 02 11:32:58 crc kubenswrapper[4929]: E1002 11:32:58.998374 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63\": container with ID starting with d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63 not found: ID does not exist" containerID="d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.998491 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63"} err="failed to get container status \"d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63\": rpc error: code = NotFound desc = could not find container \"d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63\": container with ID starting with d167dd381639c3f75bf8234a3dac580a6878a696f8f768be1b4b95779b8d5d63 not found: ID does not exist" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.998587 4929 scope.go:117] "RemoveContainer" containerID="de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a" Oct 02 11:32:58 crc kubenswrapper[4929]: E1002 11:32:58.999015 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a\": container with ID starting with de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a not found: ID does not exist" containerID="de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a" Oct 02 11:32:58 crc kubenswrapper[4929]: I1002 11:32:58.999047 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a"} err="failed to get container status \"de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a\": rpc error: code = NotFound desc = could not find container \"de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a\": container with ID starting with de3700d9e8f449d3bb826667c9a749d17af0b8166ad63e1787c4831e87db512a not found: ID does not exist" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.005397 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.015055 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:59 crc kubenswrapper[4929]: E1002 11:32:59.015542 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-log" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.015559 4929 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-log" Oct 02 11:32:59 crc kubenswrapper[4929]: E1002 11:32:59.015574 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-metadata" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.015580 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-metadata" Oct 02 11:32:59 crc kubenswrapper[4929]: E1002 11:32:59.015604 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f111be9-1a7c-4249-9cd0-26df16b3bf63" containerName="nova-manage" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.015609 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f111be9-1a7c-4249-9cd0-26df16b3bf63" containerName="nova-manage" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.015790 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f111be9-1a7c-4249-9cd0-26df16b3bf63" containerName="nova-manage" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.015803 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-log" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.015813 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0f785a-7ca6-4c70-acda-848269251cca" containerName="nova-metadata-metadata" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.016782 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.021079 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.021311 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.028929 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.154937 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpsnz\" (UniqueName: \"kubernetes.io/projected/f5faf6a4-6d67-4104-817f-422bdde6bf30-kube-api-access-bpsnz\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.155149 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-config-data\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.155462 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.155559 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.155700 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5faf6a4-6d67-4104-817f-422bdde6bf30-logs\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.257069 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5faf6a4-6d67-4104-817f-422bdde6bf30-logs\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.257165 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpsnz\" (UniqueName: \"kubernetes.io/projected/f5faf6a4-6d67-4104-817f-422bdde6bf30-kube-api-access-bpsnz\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.257266 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-config-data\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.257425 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.257490 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.257522 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5faf6a4-6d67-4104-817f-422bdde6bf30-logs\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.262626 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.263183 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.264164 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-config-data\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.276189 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpsnz\" (UniqueName: \"kubernetes.io/projected/f5faf6a4-6d67-4104-817f-422bdde6bf30-kube-api-access-bpsnz\") pod \"nova-metadata-0\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.342924 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.767487 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:32:59 crc kubenswrapper[4929]: I1002 11:32:59.954397 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5faf6a4-6d67-4104-817f-422bdde6bf30","Type":"ContainerStarted","Data":"f05b53b0cabfeb9ac4a426b2f97a57b668cc6c38069d683b71c0b821afa7c510"} Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.172139 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0f785a-7ca6-4c70-acda-848269251cca" path="/var/lib/kubelet/pods/be0f785a-7ca6-4c70-acda-848269251cca/volumes" Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.966589 4929 generic.go:334] "Generic (PLEG): container finished" podID="c91c4a43-20c7-407e-8651-48f9956ba3da" containerID="2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d" exitCode=0 Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.966686 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c91c4a43-20c7-407e-8651-48f9956ba3da","Type":"ContainerDied","Data":"2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d"} Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.967173 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c91c4a43-20c7-407e-8651-48f9956ba3da","Type":"ContainerDied","Data":"60b5bd8d6df3aa34987e52e1d32bfce01d74cc29af05b9a58d63d7884dabb3f9"} Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.967188 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60b5bd8d6df3aa34987e52e1d32bfce01d74cc29af05b9a58d63d7884dabb3f9" Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.969227 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5faf6a4-6d67-4104-817f-422bdde6bf30","Type":"ContainerStarted","Data":"b99a17b7ec88652946955b5fdf985f5b9d3d8bd15ef24dfadcf98117eac94d02"} Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.969281 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5faf6a4-6d67-4104-817f-422bdde6bf30","Type":"ContainerStarted","Data":"3153b1529e0c32bf3f30edef4acbd966c8aa5b2583cf6efe7dd0a6f5cab02ebd"} Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.971888 4929 generic.go:334] "Generic (PLEG): container finished" podID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerID="445a4669cfd1367d95c771ba012b31997713784976e57bab92cfb76f32dcc7e0" exitCode=0 Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.971927 4929 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d05d7753-47ab-40ab-b48a-ad9a67d207c4","Type":"ContainerDied","Data":"445a4669cfd1367d95c771ba012b31997713784976e57bab92cfb76f32dcc7e0"} Oct 02 11:33:00 crc kubenswrapper[4929]: I1002 11:33:00.992900 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.992880336 podStartE2EDuration="2.992880336s" podCreationTimestamp="2025-10-02 11:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:33:00.991370632 +0000 UTC m=+1381.541736996" watchObservedRunningTime="2025-10-02 11:33:00.992880336 +0000 UTC m=+1381.543246700" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.002885 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.087154 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbw96\" (UniqueName: \"kubernetes.io/projected/c91c4a43-20c7-407e-8651-48f9956ba3da-kube-api-access-cbw96\") pod \"c91c4a43-20c7-407e-8651-48f9956ba3da\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.087277 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-combined-ca-bundle\") pod \"c91c4a43-20c7-407e-8651-48f9956ba3da\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.087310 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-config-data\") pod \"c91c4a43-20c7-407e-8651-48f9956ba3da\" (UID: \"c91c4a43-20c7-407e-8651-48f9956ba3da\") " Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.092241 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91c4a43-20c7-407e-8651-48f9956ba3da-kube-api-access-cbw96" (OuterVolumeSpecName: "kube-api-access-cbw96") pod "c91c4a43-20c7-407e-8651-48f9956ba3da" (UID: "c91c4a43-20c7-407e-8651-48f9956ba3da"). InnerVolumeSpecName "kube-api-access-cbw96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.119204 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-config-data" (OuterVolumeSpecName: "config-data") pod "c91c4a43-20c7-407e-8651-48f9956ba3da" (UID: "c91c4a43-20c7-407e-8651-48f9956ba3da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.119769 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c91c4a43-20c7-407e-8651-48f9956ba3da" (UID: "c91c4a43-20c7-407e-8651-48f9956ba3da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.188902 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbw96\" (UniqueName: \"kubernetes.io/projected/c91c4a43-20c7-407e-8651-48f9956ba3da-kube-api-access-cbw96\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.188938 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.188948 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c91c4a43-20c7-407e-8651-48f9956ba3da-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.516813 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.594888 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05d7753-47ab-40ab-b48a-ad9a67d207c4-logs\") pod \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.595016 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-combined-ca-bundle\") pod \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.595074 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw29v\" (UniqueName: \"kubernetes.io/projected/d05d7753-47ab-40ab-b48a-ad9a67d207c4-kube-api-access-hw29v\") pod \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.595297 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-config-data\") pod \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\" (UID: \"d05d7753-47ab-40ab-b48a-ad9a67d207c4\") " Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.595405 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d05d7753-47ab-40ab-b48a-ad9a67d207c4-logs" (OuterVolumeSpecName: "logs") pod "d05d7753-47ab-40ab-b48a-ad9a67d207c4" (UID: "d05d7753-47ab-40ab-b48a-ad9a67d207c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.595796 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05d7753-47ab-40ab-b48a-ad9a67d207c4-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.614028 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05d7753-47ab-40ab-b48a-ad9a67d207c4-kube-api-access-hw29v" (OuterVolumeSpecName: "kube-api-access-hw29v") pod "d05d7753-47ab-40ab-b48a-ad9a67d207c4" (UID: "d05d7753-47ab-40ab-b48a-ad9a67d207c4"). InnerVolumeSpecName "kube-api-access-hw29v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.645351 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d05d7753-47ab-40ab-b48a-ad9a67d207c4" (UID: "d05d7753-47ab-40ab-b48a-ad9a67d207c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.665372 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-config-data" (OuterVolumeSpecName: "config-data") pod "d05d7753-47ab-40ab-b48a-ad9a67d207c4" (UID: "d05d7753-47ab-40ab-b48a-ad9a67d207c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.697328 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.697354 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05d7753-47ab-40ab-b48a-ad9a67d207c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.697366 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw29v\" (UniqueName: \"kubernetes.io/projected/d05d7753-47ab-40ab-b48a-ad9a67d207c4-kube-api-access-hw29v\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.982551 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d05d7753-47ab-40ab-b48a-ad9a67d207c4","Type":"ContainerDied","Data":"9a05e9c86f3d308b3473e809efdfb16402e145e6c5ebfb32886a2d2003723082"} Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.982611 4929 scope.go:117] "RemoveContainer" containerID="445a4669cfd1367d95c771ba012b31997713784976e57bab92cfb76f32dcc7e0" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.982626 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:01 crc kubenswrapper[4929]: I1002 11:33:01.982613 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.020059 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.032539 4929 scope.go:117] "RemoveContainer" containerID="585d4372ec772d40f98a30f22a46df71321b6eed1d9cb23c362b9295f7e991b0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.047290 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.058341 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.069644 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.079508 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: E1002 11:33:02.080265 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91c4a43-20c7-407e-8651-48f9956ba3da" containerName="nova-scheduler-scheduler" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.080291 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91c4a43-20c7-407e-8651-48f9956ba3da" containerName="nova-scheduler-scheduler" Oct 02 11:33:02 crc kubenswrapper[4929]: E1002 11:33:02.080307 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-api" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.080316 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-api" Oct 02 11:33:02 crc kubenswrapper[4929]: E1002 11:33:02.080336 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-log" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.080344 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-log" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.080594 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91c4a43-20c7-407e-8651-48f9956ba3da" containerName="nova-scheduler-scheduler" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.080617 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-log" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.080640 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" containerName="nova-api-api" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.081501 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.086633 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.092579 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.094126 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.096626 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.104981 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.113621 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.167600 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91c4a43-20c7-407e-8651-48f9956ba3da" path="/var/lib/kubelet/pods/c91c4a43-20c7-407e-8651-48f9956ba3da/volumes" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.168325 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05d7753-47ab-40ab-b48a-ad9a67d207c4" path="/var/lib/kubelet/pods/d05d7753-47ab-40ab-b48a-ad9a67d207c4/volumes" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.206401 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.206491 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-config-data\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.206559 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/223e0108-acc5-48ef-8875-09a096eb3ded-logs\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.206709 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts6kf\" (UniqueName: \"kubernetes.io/projected/0ba6072c-759c-4261-8107-8243d262003d-kube-api-access-ts6kf\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.206821 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xt9z\" (UniqueName: \"kubernetes.io/projected/223e0108-acc5-48ef-8875-09a096eb3ded-kube-api-access-2xt9z\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.206878 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.206979 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-config-data\") pod 
\"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.309109 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.309182 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-config-data\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.309229 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/223e0108-acc5-48ef-8875-09a096eb3ded-logs\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.309296 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts6kf\" (UniqueName: \"kubernetes.io/projected/0ba6072c-759c-4261-8107-8243d262003d-kube-api-access-ts6kf\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.309373 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xt9z\" (UniqueName: \"kubernetes.io/projected/223e0108-acc5-48ef-8875-09a096eb3ded-kube-api-access-2xt9z\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.309407 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.309431 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-config-data\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.310492 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/223e0108-acc5-48ef-8875-09a096eb3ded-logs\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.314389 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.314465 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.314898 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-config-data\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.315320 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-config-data\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.330480 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts6kf\" (UniqueName: \"kubernetes.io/projected/0ba6072c-759c-4261-8107-8243d262003d-kube-api-access-ts6kf\") pod \"nova-scheduler-0\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.333907 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xt9z\" (UniqueName: \"kubernetes.io/projected/223e0108-acc5-48ef-8875-09a096eb3ded-kube-api-access-2xt9z\") pod \"nova-api-0\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " pod="openstack/nova-api-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.399391 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:33:02 crc kubenswrapper[4929]: I1002 11:33:02.417043 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:03 crc kubenswrapper[4929]: I1002 11:33:02.861051 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:33:03 crc kubenswrapper[4929]: W1002 11:33:02.868032 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ba6072c_759c_4261_8107_8243d262003d.slice/crio-e5ae839ebb89fab5d23c50ba7108f6172c2b628a0a54fcf3daf0de30f391e438 WatchSource:0}: Error finding container e5ae839ebb89fab5d23c50ba7108f6172c2b628a0a54fcf3daf0de30f391e438: Status 404 returned error can't find the container with id e5ae839ebb89fab5d23c50ba7108f6172c2b628a0a54fcf3daf0de30f391e438 Oct 02 11:33:03 crc kubenswrapper[4929]: I1002 11:33:02.933635 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:03 crc kubenswrapper[4929]: W1002 11:33:02.934421 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod223e0108_acc5_48ef_8875_09a096eb3ded.slice/crio-ed48e61e2eb93d7cf64afa5f64a9e34c48e8c3a7c14d2f245baf1a1fb09808a6 WatchSource:0}: Error finding container ed48e61e2eb93d7cf64afa5f64a9e34c48e8c3a7c14d2f245baf1a1fb09808a6: Status 404 returned error can't find the container with id ed48e61e2eb93d7cf64afa5f64a9e34c48e8c3a7c14d2f245baf1a1fb09808a6 Oct 02 11:33:03 crc kubenswrapper[4929]: I1002 11:33:02.992032 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba6072c-759c-4261-8107-8243d262003d","Type":"ContainerStarted","Data":"e5ae839ebb89fab5d23c50ba7108f6172c2b628a0a54fcf3daf0de30f391e438"} Oct 02 11:33:03 crc kubenswrapper[4929]: I1002 11:33:02.993558 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"223e0108-acc5-48ef-8875-09a096eb3ded","Type":"ContainerStarted","Data":"ed48e61e2eb93d7cf64afa5f64a9e34c48e8c3a7c14d2f245baf1a1fb09808a6"} Oct 02 11:33:04 crc kubenswrapper[4929]: I1002 11:33:04.005223 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba6072c-759c-4261-8107-8243d262003d","Type":"ContainerStarted","Data":"212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383"} Oct 02 11:33:04 crc kubenswrapper[4929]: I1002 11:33:04.009099 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"223e0108-acc5-48ef-8875-09a096eb3ded","Type":"ContainerStarted","Data":"0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd"} Oct 02 11:33:04 crc kubenswrapper[4929]: I1002 11:33:04.009144 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"223e0108-acc5-48ef-8875-09a096eb3ded","Type":"ContainerStarted","Data":"19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203"} Oct 02 11:33:04 crc kubenswrapper[4929]: I1002 11:33:04.021815 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.02179728 podStartE2EDuration="2.02179728s" podCreationTimestamp="2025-10-02 11:33:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:33:04.019240117 +0000 UTC m=+1384.569606491" watchObservedRunningTime="2025-10-02 11:33:04.02179728 +0000 UTC m=+1384.572163634" Oct 02 11:33:04 crc kubenswrapper[4929]: I1002 11:33:04.044785 4929 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.044756809 podStartE2EDuration="2.044756809s" podCreationTimestamp="2025-10-02 11:33:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:33:04.035479393 +0000 UTC m=+1384.585845767" watchObservedRunningTime="2025-10-02 11:33:04.044756809 +0000 UTC m=+1384.595123183" Oct 02 11:33:04 crc kubenswrapper[4929]: I1002 11:33:04.343303 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:33:04 crc kubenswrapper[4929]: I1002 11:33:04.343366 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:33:07 crc kubenswrapper[4929]: I1002 11:33:07.400026 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:33:09 crc kubenswrapper[4929]: I1002 11:33:09.343172 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:33:09 crc kubenswrapper[4929]: I1002 11:33:09.343505 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:33:10 crc kubenswrapper[4929]: I1002 11:33:10.355144 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:33:10 crc kubenswrapper[4929]: I1002 11:33:10.355182 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:33:12 crc kubenswrapper[4929]: I1002 11:33:12.399949 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:33:12 crc kubenswrapper[4929]: I1002 11:33:12.418101 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:33:12 crc kubenswrapper[4929]: I1002 11:33:12.418182 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:33:12 crc kubenswrapper[4929]: I1002 11:33:12.427079 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:33:13 crc kubenswrapper[4929]: I1002 11:33:13.134868 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:33:13 crc kubenswrapper[4929]: I1002 11:33:13.501152 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:33:13 crc kubenswrapper[4929]: I1002 11:33:13.501205 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:33:19 crc kubenswrapper[4929]: I1002 11:33:19.349784 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:33:19 crc kubenswrapper[4929]: I1002 11:33:19.355567 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:33:19 crc kubenswrapper[4929]: I1002 11:33:19.356858 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:33:20 crc kubenswrapper[4929]: I1002 11:33:20.214828 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.422121 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.422676 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.423060 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.423086 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.426110 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.427253 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.608000 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6bjrl"] Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.609550 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.632278 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6bjrl"] Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.730215 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-config\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.730334 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.730383 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.730421 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.730446 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8mpd\" (UniqueName: \"kubernetes.io/projected/692c9c38-07d7-455f-8d9c-984904aef051-kube-api-access-s8mpd\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.730514 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.831832 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.832215 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-config\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.832510 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.832642 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.832745 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.832844 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8mpd\" (UniqueName: \"kubernetes.io/projected/692c9c38-07d7-455f-8d9c-984904aef051-kube-api-access-s8mpd\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.832913 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.833105 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-config\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.833297 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.833771 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.834212 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.857253 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8mpd\" (UniqueName: 
\"kubernetes.io/projected/692c9c38-07d7-455f-8d9c-984904aef051-kube-api-access-s8mpd\") pod \"dnsmasq-dns-cd5cbd7b9-6bjrl\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:22 crc kubenswrapper[4929]: I1002 11:33:22.933191 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:23 crc kubenswrapper[4929]: I1002 11:33:23.409904 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6bjrl"] Oct 02 11:33:23 crc kubenswrapper[4929]: W1002 11:33:23.418508 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod692c9c38_07d7_455f_8d9c_984904aef051.slice/crio-37d13757387e0e0ed1cc4e1a877cea686e239b0e872288b067ae4fd7a59db191 WatchSource:0}: Error finding container 37d13757387e0e0ed1cc4e1a877cea686e239b0e872288b067ae4fd7a59db191: Status 404 returned error can't find the container with id 37d13757387e0e0ed1cc4e1a877cea686e239b0e872288b067ae4fd7a59db191 Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.247582 4929 generic.go:334] "Generic (PLEG): container finished" podID="692c9c38-07d7-455f-8d9c-984904aef051" containerID="ca836523d4cd1ebebcad4ee242c555398051bedcf444894c067e5f3d2639700e" exitCode=0 Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.247684 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" event={"ID":"692c9c38-07d7-455f-8d9c-984904aef051","Type":"ContainerDied","Data":"ca836523d4cd1ebebcad4ee242c555398051bedcf444894c067e5f3d2639700e"} Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.248168 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" event={"ID":"692c9c38-07d7-455f-8d9c-984904aef051","Type":"ContainerStarted","Data":"37d13757387e0e0ed1cc4e1a877cea686e239b0e872288b067ae4fd7a59db191"} Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.585935 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.586482 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-central-agent" containerID="cri-o://378198d3bdaf31b39f05c2d9998458f2601f1eb2b7ed72301e1c3f101ebfc685" gracePeriod=30 Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.586557 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="proxy-httpd" containerID="cri-o://7475865ab42c5bb80c1c3b6feab81e267357d53bd3070ce6eccef6bf8fb07380" gracePeriod=30 Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.586534 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="sg-core" containerID="cri-o://e9a6ecf6c8e71cb48a05fe5105a51cfc13f258d7e3bbe228ec11f1aa2ef040d2" gracePeriod=30 Oct 02 11:33:24 crc kubenswrapper[4929]: I1002 11:33:24.586534 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-notification-agent" containerID="cri-o://dece7c301f7b642703a0082328c976ccce23b7c915e41c9d02977815199f35bb" gracePeriod=30 Oct 02 11:33:24 crc 
kubenswrapper[4929]: I1002 11:33:24.859295 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.259228 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" event={"ID":"692c9c38-07d7-455f-8d9c-984904aef051","Type":"ContainerStarted","Data":"aae3e40e81be663620ecc6be606854b80d924a650235106f6750681901686f12"} Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.260517 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263296 4929 generic.go:334] "Generic (PLEG): container finished" podID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerID="7475865ab42c5bb80c1c3b6feab81e267357d53bd3070ce6eccef6bf8fb07380" exitCode=0 Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263323 4929 generic.go:334] "Generic (PLEG): container finished" podID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerID="e9a6ecf6c8e71cb48a05fe5105a51cfc13f258d7e3bbe228ec11f1aa2ef040d2" exitCode=2 Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263333 4929 generic.go:334] "Generic (PLEG): container finished" podID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerID="378198d3bdaf31b39f05c2d9998458f2601f1eb2b7ed72301e1c3f101ebfc685" exitCode=0 Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263511 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-log" containerID="cri-o://19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203" gracePeriod=30 Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263757 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerDied","Data":"7475865ab42c5bb80c1c3b6feab81e267357d53bd3070ce6eccef6bf8fb07380"} Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263786 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerDied","Data":"e9a6ecf6c8e71cb48a05fe5105a51cfc13f258d7e3bbe228ec11f1aa2ef040d2"} Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263799 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerDied","Data":"378198d3bdaf31b39f05c2d9998458f2601f1eb2b7ed72301e1c3f101ebfc685"} Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.263856 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-api" containerID="cri-o://0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd" gracePeriod=30 Oct 02 11:33:25 crc kubenswrapper[4929]: I1002 11:33:25.284747 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" podStartSLOduration=3.28473252 podStartE2EDuration="3.28473252s" podCreationTimestamp="2025-10-02 11:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:33:25.280605242 +0000 UTC m=+1405.830971606" watchObservedRunningTime="2025-10-02 11:33:25.28473252 +0000 UTC m=+1405.835098884" Oct 02 11:33:26 crc 
kubenswrapper[4929]: I1002 11:33:26.274107 4929 generic.go:334] "Generic (PLEG): container finished" podID="223e0108-acc5-48ef-8875-09a096eb3ded" containerID="19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203" exitCode=143 Oct 02 11:33:26 crc kubenswrapper[4929]: I1002 11:33:26.274236 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"223e0108-acc5-48ef-8875-09a096eb3ded","Type":"ContainerDied","Data":"19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203"} Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.816532 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.951638 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-config-data\") pod \"223e0108-acc5-48ef-8875-09a096eb3ded\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.951874 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/223e0108-acc5-48ef-8875-09a096eb3ded-logs\") pod \"223e0108-acc5-48ef-8875-09a096eb3ded\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.951922 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-combined-ca-bundle\") pod \"223e0108-acc5-48ef-8875-09a096eb3ded\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.952038 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xt9z\" (UniqueName: \"kubernetes.io/projected/223e0108-acc5-48ef-8875-09a096eb3ded-kube-api-access-2xt9z\") pod \"223e0108-acc5-48ef-8875-09a096eb3ded\" (UID: \"223e0108-acc5-48ef-8875-09a096eb3ded\") " Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.952877 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223e0108-acc5-48ef-8875-09a096eb3ded-logs" (OuterVolumeSpecName: "logs") pod "223e0108-acc5-48ef-8875-09a096eb3ded" (UID: "223e0108-acc5-48ef-8875-09a096eb3ded"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.962376 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223e0108-acc5-48ef-8875-09a096eb3ded-kube-api-access-2xt9z" (OuterVolumeSpecName: "kube-api-access-2xt9z") pod "223e0108-acc5-48ef-8875-09a096eb3ded" (UID: "223e0108-acc5-48ef-8875-09a096eb3ded"). InnerVolumeSpecName "kube-api-access-2xt9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:33:28 crc kubenswrapper[4929]: I1002 11:33:28.992262 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-config-data" (OuterVolumeSpecName: "config-data") pod "223e0108-acc5-48ef-8875-09a096eb3ded" (UID: "223e0108-acc5-48ef-8875-09a096eb3ded"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.025179 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "223e0108-acc5-48ef-8875-09a096eb3ded" (UID: "223e0108-acc5-48ef-8875-09a096eb3ded"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.054343 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xt9z\" (UniqueName: \"kubernetes.io/projected/223e0108-acc5-48ef-8875-09a096eb3ded-kube-api-access-2xt9z\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.054387 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.054398 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/223e0108-acc5-48ef-8875-09a096eb3ded-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.054406 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223e0108-acc5-48ef-8875-09a096eb3ded-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.301170 4929 generic.go:334] "Generic (PLEG): container finished" podID="223e0108-acc5-48ef-8875-09a096eb3ded" containerID="0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd" exitCode=0 Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.301525 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"223e0108-acc5-48ef-8875-09a096eb3ded","Type":"ContainerDied","Data":"0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd"} Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.301559 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"223e0108-acc5-48ef-8875-09a096eb3ded","Type":"ContainerDied","Data":"ed48e61e2eb93d7cf64afa5f64a9e34c48e8c3a7c14d2f245baf1a1fb09808a6"} Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.301577 4929 scope.go:117] "RemoveContainer" containerID="0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.301737 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.357101 4929 scope.go:117] "RemoveContainer" containerID="19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.389046 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.403273 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.426378 4929 scope.go:117] "RemoveContainer" containerID="0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd" Oct 02 11:33:29 crc kubenswrapper[4929]: E1002 11:33:29.430762 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd\": container with ID starting with 0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd not found: ID does not exist" containerID="0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.430823 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd"} err="failed to get container status \"0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd\": rpc error: code = NotFound desc = could not find container \"0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd\": container with ID starting with 0d8f846936acad224a7d54e9cd34b118973eed96ea3e7173b26e4f417d4f34bd not found: ID does not exist" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.430858 4929 scope.go:117] "RemoveContainer" containerID="19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.431000 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:29 crc kubenswrapper[4929]: E1002 11:33:29.431498 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-api" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.431520 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-api" Oct 02 11:33:29 crc kubenswrapper[4929]: E1002 11:33:29.431567 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-log" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.431575 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-log" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.431880 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-api" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.432048 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" containerName="nova-api-log" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.434597 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: E1002 11:33:29.438193 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203\": container with ID starting with 19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203 not found: ID does not exist" containerID="19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.438247 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203"} err="failed to get container status \"19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203\": rpc error: code = NotFound desc = could not find container \"19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203\": container with ID starting with 19dbe5af6d1a5033dcdafc5c0ada457c472fd2f4874174ffa5a2883fcbac0203 not found: ID does not exist" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.453333 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.453784 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.454165 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.456219 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.566810 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.567001 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-config-data\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.567066 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8th\" (UniqueName: \"kubernetes.io/projected/2fafc589-0041-44b2-a66b-93f4676c3cb1-kube-api-access-kr8th\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.567169 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.567359 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fafc589-0041-44b2-a66b-93f4676c3cb1-logs\") pod \"nova-api-0\" (UID: 
\"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.567465 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.669799 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.669889 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fafc589-0041-44b2-a66b-93f4676c3cb1-logs\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.669923 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.669994 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.670047 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-config-data\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.670069 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8th\" (UniqueName: \"kubernetes.io/projected/2fafc589-0041-44b2-a66b-93f4676c3cb1-kube-api-access-kr8th\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.670513 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fafc589-0041-44b2-a66b-93f4676c3cb1-logs\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.674888 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.675297 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.675569 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.676031 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-config-data\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.688669 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8th\" (UniqueName: \"kubernetes.io/projected/2fafc589-0041-44b2-a66b-93f4676c3cb1-kube-api-access-kr8th\") pod \"nova-api-0\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " pod="openstack/nova-api-0" Oct 02 11:33:29 crc kubenswrapper[4929]: I1002 11:33:29.775268 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:33:30 crc kubenswrapper[4929]: I1002 11:33:30.169172 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223e0108-acc5-48ef-8875-09a096eb3ded" path="/var/lib/kubelet/pods/223e0108-acc5-48ef-8875-09a096eb3ded/volumes" Oct 02 11:33:30 crc kubenswrapper[4929]: I1002 11:33:30.207060 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:33:30 crc kubenswrapper[4929]: I1002 11:33:30.312598 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fafc589-0041-44b2-a66b-93f4676c3cb1","Type":"ContainerStarted","Data":"2b01a875553d4fc10378af9772d0e06b0e374db11e481f765d528a5b5b557c83"} Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.336564 4929 generic.go:334] "Generic (PLEG): container finished" podID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerID="dece7c301f7b642703a0082328c976ccce23b7c915e41c9d02977815199f35bb" exitCode=0 Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.336699 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerDied","Data":"dece7c301f7b642703a0082328c976ccce23b7c915e41c9d02977815199f35bb"} Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.340570 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fafc589-0041-44b2-a66b-93f4676c3cb1","Type":"ContainerStarted","Data":"ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1"} Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.340607 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fafc589-0041-44b2-a66b-93f4676c3cb1","Type":"ContainerStarted","Data":"a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd"} Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.366403 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.3663764 podStartE2EDuration="2.3663764s" podCreationTimestamp="2025-10-02 11:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:33:31.360752989 
+0000 UTC m=+1411.911119373" watchObservedRunningTime="2025-10-02 11:33:31.3663764 +0000 UTC m=+1411.916742764"
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.455067 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.608831 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-combined-ca-bundle\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.608921 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-run-httpd\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.609042 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-scripts\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.609075 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-sg-core-conf-yaml\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.609111 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-ceilometer-tls-certs\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.609152 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-config-data\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.609206 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd2lk\" (UniqueName: \"kubernetes.io/projected/3d3a3351-f020-46b3-b66b-0a94aee376c6-kube-api-access-kd2lk\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.609230 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-log-httpd\") pod \"3d3a3351-f020-46b3-b66b-0a94aee376c6\" (UID: \"3d3a3351-f020-46b3-b66b-0a94aee376c6\") "
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.609848 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.610085 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.615677 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-scripts" (OuterVolumeSpecName: "scripts") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.616925 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3a3351-f020-46b3-b66b-0a94aee376c6-kube-api-access-kd2lk" (OuterVolumeSpecName: "kube-api-access-kd2lk") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "kube-api-access-kd2lk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.646459 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.686565 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.708329 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.712081 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.712119 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.712132 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.712144 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.712159 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.712171 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd2lk\" (UniqueName: \"kubernetes.io/projected/3d3a3351-f020-46b3-b66b-0a94aee376c6-kube-api-access-kd2lk\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.712184 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d3a3351-f020-46b3-b66b-0a94aee376c6-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.734027 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-config-data" (OuterVolumeSpecName: "config-data") pod "3d3a3351-f020-46b3-b66b-0a94aee376c6" (UID: "3d3a3351-f020-46b3-b66b-0a94aee376c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:33:31 crc kubenswrapper[4929]: I1002 11:33:31.813403 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3a3351-f020-46b3-b66b-0a94aee376c6-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.357951 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.358850 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d3a3351-f020-46b3-b66b-0a94aee376c6","Type":"ContainerDied","Data":"1e6e3ef3e94ff4e9507c5fb38f0e4259341e1b0366694b2a146c4068f7cb1cb4"}
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.358900 4929 scope.go:117] "RemoveContainer" containerID="7475865ab42c5bb80c1c3b6feab81e267357d53bd3070ce6eccef6bf8fb07380"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.387798 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.402934 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.403804 4929 scope.go:117] "RemoveContainer" containerID="e9a6ecf6c8e71cb48a05fe5105a51cfc13f258d7e3bbe228ec11f1aa2ef040d2"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416081 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:33:32 crc kubenswrapper[4929]: E1002 11:33:32.416580 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="sg-core"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416605 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="sg-core"
Oct 02 11:33:32 crc kubenswrapper[4929]: E1002 11:33:32.416622 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="proxy-httpd"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416631 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="proxy-httpd"
Oct 02 11:33:32 crc kubenswrapper[4929]: E1002 11:33:32.416699 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-central-agent"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416708 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-central-agent"
Oct 02 11:33:32 crc kubenswrapper[4929]: E1002 11:33:32.416720 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-notification-agent"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416730 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-notification-agent"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416903 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="sg-core"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416929 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-notification-agent"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416944 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="ceilometer-central-agent"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.416973 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" containerName="proxy-httpd"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.418579 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.425520 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.425672 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.425714 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.447363 4929 scope.go:117] "RemoveContainer" containerID="dece7c301f7b642703a0082328c976ccce23b7c915e41c9d02977815199f35bb"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.449244 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.479017 4929 scope.go:117] "RemoveContainer" containerID="378198d3bdaf31b39f05c2d9998458f2601f1eb2b7ed72301e1c3f101ebfc685"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.527186 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.527253 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-run-httpd\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.527601 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.527710 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-log-httpd\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.527767 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.527954 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-scripts\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.528129 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-config-data\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.528321 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8n82\" (UniqueName: \"kubernetes.io/projected/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-kube-api-access-n8n82\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630102 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-scripts\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630209 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-config-data\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630237 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8n82\" (UniqueName: \"kubernetes.io/projected/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-kube-api-access-n8n82\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630272 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630303 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-run-httpd\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630363 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630398 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-log-httpd\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.630417 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.631565 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-log-httpd\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.631571 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-run-httpd\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.636515 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.636683 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.637004 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-config-data\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.638409 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-scripts\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.638570 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.650796 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8n82\" (UniqueName: \"kubernetes.io/projected/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-kube-api-access-n8n82\") pod \"ceilometer-0\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.752045 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.936056 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl"
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.998100 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qkcqx"]
Oct 02 11:33:32 crc kubenswrapper[4929]: I1002 11:33:32.998380 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" podUID="03478cff-4797-4d9b-82f6-d5588149c889" containerName="dnsmasq-dns" containerID="cri-o://a27c42cfbaf310401f831330bf840bc30a6282e10cdf25d8baa99d6fb094e34b" gracePeriod=10
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.204406 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.230784 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.374257 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerStarted","Data":"94e3355a2acf155dec4f8266e43690cd87630399c3d1cbf5e7d488f92a2c1022"}
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.376454 4929 generic.go:334] "Generic (PLEG): container finished" podID="03478cff-4797-4d9b-82f6-d5588149c889" containerID="a27c42cfbaf310401f831330bf840bc30a6282e10cdf25d8baa99d6fb094e34b" exitCode=0
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.376490 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" event={"ID":"03478cff-4797-4d9b-82f6-d5588149c889","Type":"ContainerDied","Data":"a27c42cfbaf310401f831330bf840bc30a6282e10cdf25d8baa99d6fb094e34b"}
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.426611 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx"
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.548635 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzvvv\" (UniqueName: \"kubernetes.io/projected/03478cff-4797-4d9b-82f6-d5588149c889-kube-api-access-dzvvv\") pod \"03478cff-4797-4d9b-82f6-d5588149c889\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") "
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.548744 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-sb\") pod \"03478cff-4797-4d9b-82f6-d5588149c889\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") "
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.548881 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-svc\") pod \"03478cff-4797-4d9b-82f6-d5588149c889\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") "
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.548952 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-config\") pod \"03478cff-4797-4d9b-82f6-d5588149c889\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") "
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.549025 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-swift-storage-0\") pod \"03478cff-4797-4d9b-82f6-d5588149c889\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") "
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.549087 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-nb\") pod \"03478cff-4797-4d9b-82f6-d5588149c889\" (UID: \"03478cff-4797-4d9b-82f6-d5588149c889\") "
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.554575 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03478cff-4797-4d9b-82f6-d5588149c889-kube-api-access-dzvvv" (OuterVolumeSpecName: "kube-api-access-dzvvv") pod "03478cff-4797-4d9b-82f6-d5588149c889" (UID: "03478cff-4797-4d9b-82f6-d5588149c889"). InnerVolumeSpecName "kube-api-access-dzvvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.602717 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "03478cff-4797-4d9b-82f6-d5588149c889" (UID: "03478cff-4797-4d9b-82f6-d5588149c889"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.605575 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-config" (OuterVolumeSpecName: "config") pod "03478cff-4797-4d9b-82f6-d5588149c889" (UID: "03478cff-4797-4d9b-82f6-d5588149c889"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.606122 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03478cff-4797-4d9b-82f6-d5588149c889" (UID: "03478cff-4797-4d9b-82f6-d5588149c889"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.615050 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "03478cff-4797-4d9b-82f6-d5588149c889" (UID: "03478cff-4797-4d9b-82f6-d5588149c889"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.627316 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "03478cff-4797-4d9b-82f6-d5588149c889" (UID: "03478cff-4797-4d9b-82f6-d5588149c889"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.651823 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.651868 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzvvv\" (UniqueName: \"kubernetes.io/projected/03478cff-4797-4d9b-82f6-d5588149c889-kube-api-access-dzvvv\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.651882 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.651894 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.651906 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:33 crc kubenswrapper[4929]: I1002 11:33:33.651916 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03478cff-4797-4d9b-82f6-d5588149c889-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.179090 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3a3351-f020-46b3-b66b-0a94aee376c6" path="/var/lib/kubelet/pods/3d3a3351-f020-46b3-b66b-0a94aee376c6/volumes"
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.388832 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx" event={"ID":"03478cff-4797-4d9b-82f6-d5588149c889","Type":"ContainerDied","Data":"e6485ec176d23ea4a7871568db6d85e9bc771304f1a751c94fc3a9980b0002da"}
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.388928 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qkcqx"
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.389046 4929 scope.go:117] "RemoveContainer" containerID="a27c42cfbaf310401f831330bf840bc30a6282e10cdf25d8baa99d6fb094e34b"
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.390933 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerStarted","Data":"83423c2654ca49651439a144e3eff0c4e3371ed929b89d62422a53c0a6be0dea"}
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.418315 4929 scope.go:117] "RemoveContainer" containerID="f12080b65b6bea1b6c3b5c515a3991fd750e4b7f9ea6cd5b70da2316dbab77cd"
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.418826 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qkcqx"]
Oct 02 11:33:34 crc kubenswrapper[4929]: I1002 11:33:34.435729 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qkcqx"]
Oct 02 11:33:35 crc kubenswrapper[4929]: I1002 11:33:35.402068 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerStarted","Data":"26cde39221ac8a2072a3fb8c38cbfe2e085b51f160c11eafe83d008ddf719bf8"}
Oct 02 11:33:36 crc kubenswrapper[4929]: I1002 11:33:36.169804 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03478cff-4797-4d9b-82f6-d5588149c889" path="/var/lib/kubelet/pods/03478cff-4797-4d9b-82f6-d5588149c889/volumes"
Oct 02 11:33:36 crc kubenswrapper[4929]: I1002 11:33:36.415097 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerStarted","Data":"c8b786a68a0810d547f648303a29ccea6d4efcfa31e794fa4cf6a27a57b61127"}
Oct 02 11:33:38 crc kubenswrapper[4929]: I1002 11:33:38.435580 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerStarted","Data":"e5d8d0d67dce0c56c6f683c231f1e918df07bdc899e3c55e672d1e0c06c2472e"}
Oct 02 11:33:38 crc kubenswrapper[4929]: I1002 11:33:38.436391 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 02 11:33:38 crc kubenswrapper[4929]: I1002 11:33:38.473203 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.07193172 podStartE2EDuration="6.473179084s" podCreationTimestamp="2025-10-02 11:33:32 +0000 UTC" firstStartedPulling="2025-10-02 11:33:33.230536127 +0000 UTC m=+1413.780902491" lastFinishedPulling="2025-10-02 11:33:37.631783471 +0000 UTC m=+1418.182149855" observedRunningTime="2025-10-02 11:33:38.454339634 +0000 UTC m=+1419.004705998" watchObservedRunningTime="2025-10-02 11:33:38.473179084 +0000 UTC m=+1419.023545448"
Oct 02 11:33:39 crc kubenswrapper[4929]: I1002 11:33:39.776400 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 11:33:39 crc kubenswrapper[4929]: I1002 11:33:39.778000 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 11:33:40 crc kubenswrapper[4929]: I1002 11:33:40.786078 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:33:40 crc kubenswrapper[4929]: I1002 11:33:40.786367 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:33:44 crc kubenswrapper[4929]: I1002 11:33:44.736355 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:33:44 crc kubenswrapper[4929]: I1002 11:33:44.737009 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:33:49 crc kubenswrapper[4929]: I1002 11:33:49.782180 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 11:33:49 crc kubenswrapper[4929]: I1002 11:33:49.782707 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 11:33:49 crc kubenswrapper[4929]: I1002 11:33:49.783794 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 11:33:49 crc kubenswrapper[4929]: I1002 11:33:49.783836 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 11:33:49 crc kubenswrapper[4929]: I1002 11:33:49.790653 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 11:33:49 crc kubenswrapper[4929]: I1002 11:33:49.791872 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.465902 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8fv9s"]
Oct 02 11:33:55 crc kubenswrapper[4929]: E1002 11:33:55.466581 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03478cff-4797-4d9b-82f6-d5588149c889" containerName="init"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.466594 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="03478cff-4797-4d9b-82f6-d5588149c889" containerName="init"
Oct 02 11:33:55 crc kubenswrapper[4929]: E1002 11:33:55.466618 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03478cff-4797-4d9b-82f6-d5588149c889" containerName="dnsmasq-dns"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.466624 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="03478cff-4797-4d9b-82f6-d5588149c889" containerName="dnsmasq-dns"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.466825 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="03478cff-4797-4d9b-82f6-d5588149c889" containerName="dnsmasq-dns"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.468162 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.492085 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8fv9s"]
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.580299 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-utilities\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.580421 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-catalog-content\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.580553 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmbm\" (UniqueName: \"kubernetes.io/projected/36c12969-037f-4e43-a543-9ab439ce90aa-kube-api-access-fdmbm\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.681803 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmbm\" (UniqueName: \"kubernetes.io/projected/36c12969-037f-4e43-a543-9ab439ce90aa-kube-api-access-fdmbm\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.681924 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-utilities\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.682019 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-catalog-content\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.682417 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-utilities\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.682526 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-catalog-content\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.706707 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmbm\" (UniqueName: \"kubernetes.io/projected/36c12969-037f-4e43-a543-9ab439ce90aa-kube-api-access-fdmbm\") pod \"redhat-operators-8fv9s\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") " pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:55 crc kubenswrapper[4929]: I1002 11:33:55.796472 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:33:56 crc kubenswrapper[4929]: I1002 11:33:56.278144 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8fv9s"]
Oct 02 11:33:56 crc kubenswrapper[4929]: W1002 11:33:56.295435 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c12969_037f_4e43_a543_9ab439ce90aa.slice/crio-6ada0fd57dd9aed0c39f67a4cb13486ed06fa2f292e21cab24a49f1d9d79b819 WatchSource:0}: Error finding container 6ada0fd57dd9aed0c39f67a4cb13486ed06fa2f292e21cab24a49f1d9d79b819: Status 404 returned error can't find the container with id 6ada0fd57dd9aed0c39f67a4cb13486ed06fa2f292e21cab24a49f1d9d79b819
Oct 02 11:33:56 crc kubenswrapper[4929]: I1002 11:33:56.597297 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8fv9s" event={"ID":"36c12969-037f-4e43-a543-9ab439ce90aa","Type":"ContainerStarted","Data":"6ada0fd57dd9aed0c39f67a4cb13486ed06fa2f292e21cab24a49f1d9d79b819"}
Oct 02 11:33:57 crc kubenswrapper[4929]: I1002 11:33:57.606534 4929 generic.go:334] "Generic (PLEG): container finished" podID="36c12969-037f-4e43-a543-9ab439ce90aa" containerID="628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a" exitCode=0
Oct 02 11:33:57 crc kubenswrapper[4929]: I1002 11:33:57.606610 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8fv9s" event={"ID":"36c12969-037f-4e43-a543-9ab439ce90aa","Type":"ContainerDied","Data":"628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a"}
Oct 02 11:33:58 crc kubenswrapper[4929]: I1002 11:33:58.620007 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8fv9s" event={"ID":"36c12969-037f-4e43-a543-9ab439ce90aa","Type":"ContainerStarted","Data":"ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b"}
Oct 02 11:33:59 crc kubenswrapper[4929]: I1002 11:33:59.635486 4929 generic.go:334] "Generic (PLEG): container finished" podID="36c12969-037f-4e43-a543-9ab439ce90aa" containerID="ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b" exitCode=0
Oct 02 11:33:59 crc kubenswrapper[4929]: I1002 11:33:59.635557 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8fv9s" event={"ID":"36c12969-037f-4e43-a543-9ab439ce90aa","Type":"ContainerDied","Data":"ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b"}
Oct 02 11:34:00 crc kubenswrapper[4929]: I1002 11:34:00.648836 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8fv9s" event={"ID":"36c12969-037f-4e43-a543-9ab439ce90aa","Type":"ContainerStarted","Data":"e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b"}
Oct 02 11:34:00 crc kubenswrapper[4929]: I1002 11:34:00.676106 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8fv9s" podStartSLOduration=3.199443095 podStartE2EDuration="5.676058749s" podCreationTimestamp="2025-10-02 11:33:55 +0000 UTC" firstStartedPulling="2025-10-02 11:33:57.608046696 +0000 UTC m=+1438.158413050" lastFinishedPulling="2025-10-02 11:34:00.08466234 +0000 UTC m=+1440.635028704" observedRunningTime="2025-10-02 11:34:00.672338922 +0000 UTC m=+1441.222705316" watchObservedRunningTime="2025-10-02 11:34:00.676058749 +0000 UTC m=+1441.226425113"
Oct 02 11:34:02 crc kubenswrapper[4929]: I1002 11:34:02.763096 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 11:34:05 crc kubenswrapper[4929]: I1002 11:34:05.797344 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:34:05 crc kubenswrapper[4929]: I1002 11:34:05.797572 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:34:05 crc kubenswrapper[4929]: I1002 11:34:05.849060 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:34:06 crc kubenswrapper[4929]: I1002 11:34:06.738923 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:34:06 crc kubenswrapper[4929]: I1002 11:34:06.789087 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8fv9s"]
Oct 02 11:34:08 crc kubenswrapper[4929]: I1002 11:34:08.711836 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8fv9s" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="registry-server" containerID="cri-o://e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b" gracePeriod=2
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.184568 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.360745 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-utilities\") pod \"36c12969-037f-4e43-a543-9ab439ce90aa\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") "
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.360810 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdmbm\" (UniqueName: \"kubernetes.io/projected/36c12969-037f-4e43-a543-9ab439ce90aa-kube-api-access-fdmbm\") pod \"36c12969-037f-4e43-a543-9ab439ce90aa\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") "
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.360902 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-catalog-content\") pod \"36c12969-037f-4e43-a543-9ab439ce90aa\" (UID: \"36c12969-037f-4e43-a543-9ab439ce90aa\") "
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.361661 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-utilities" (OuterVolumeSpecName: "utilities") pod "36c12969-037f-4e43-a543-9ab439ce90aa" (UID: "36c12969-037f-4e43-a543-9ab439ce90aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.362616 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.366793 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c12969-037f-4e43-a543-9ab439ce90aa-kube-api-access-fdmbm" (OuterVolumeSpecName: "kube-api-access-fdmbm") pod "36c12969-037f-4e43-a543-9ab439ce90aa" (UID: "36c12969-037f-4e43-a543-9ab439ce90aa"). InnerVolumeSpecName "kube-api-access-fdmbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.441214 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36c12969-037f-4e43-a543-9ab439ce90aa" (UID: "36c12969-037f-4e43-a543-9ab439ce90aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.464292 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdmbm\" (UniqueName: \"kubernetes.io/projected/36c12969-037f-4e43-a543-9ab439ce90aa-kube-api-access-fdmbm\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.464335 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c12969-037f-4e43-a543-9ab439ce90aa-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.722148 4929 generic.go:334] "Generic (PLEG): container finished" podID="36c12969-037f-4e43-a543-9ab439ce90aa" containerID="e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b" exitCode=0
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.722205 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8fv9s" event={"ID":"36c12969-037f-4e43-a543-9ab439ce90aa","Type":"ContainerDied","Data":"e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b"}
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.722236 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8fv9s" event={"ID":"36c12969-037f-4e43-a543-9ab439ce90aa","Type":"ContainerDied","Data":"6ada0fd57dd9aed0c39f67a4cb13486ed06fa2f292e21cab24a49f1d9d79b819"}
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.722256 4929 scope.go:117] "RemoveContainer" containerID="e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.722211 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8fv9s"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.761982 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8fv9s"]
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.763609 4929 scope.go:117] "RemoveContainer" containerID="ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.772503 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8fv9s"]
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.783196 4929 scope.go:117] "RemoveContainer" containerID="628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.830071 4929 scope.go:117] "RemoveContainer" containerID="e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b"
Oct 02 11:34:09 crc kubenswrapper[4929]: E1002 11:34:09.830515 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b\": container with ID starting with e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b not found: ID does not exist" containerID="e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.830552 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b"} err="failed to get container status \"e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b\": rpc error: code = NotFound desc = could not find container \"e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b\": container with ID starting with e96048729f01ba73610261dadbc383dfc8e64206d41681896262b807ebd2661b not found: ID does not exist"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.830575 4929 scope.go:117] "RemoveContainer" containerID="ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b"
Oct 02 11:34:09 crc kubenswrapper[4929]: E1002 11:34:09.830982 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b\": container with ID starting with ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b not found: ID does not exist" containerID="ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.831015 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b"} err="failed to get container status \"ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b\": rpc error: code = NotFound desc = could not find container \"ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b\": container with ID starting with ab790e760a09ecaf567e9082f5f8a88b4deb712d24d7e89c1973b45ea242196b not found: ID does not exist"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.831033 4929 scope.go:117] "RemoveContainer" containerID="628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a"
Oct 02 11:34:09 crc kubenswrapper[4929]: E1002 11:34:09.831423 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a\": container with ID starting with 628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a not found: ID does not exist" containerID="628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a"
Oct 02 11:34:09 crc kubenswrapper[4929]: I1002 11:34:09.831454 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a"} err="failed to get container status \"628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a\": rpc error: code = NotFound desc = could not find container \"628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a\": container with ID starting with 628eee080e5f61fb72c64f42f28a3675a58b7194735f31e2971e3ab9eafc721a not found: ID does not exist"
Oct 02 11:34:10 crc kubenswrapper[4929]: I1002 11:34:10.168104 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" path="/var/lib/kubelet/pods/36c12969-037f-4e43-a543-9ab439ce90aa/volumes"
Oct 02 11:34:14 crc kubenswrapper[4929]: I1002 11:34:14.736626 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:34:14 crc kubenswrapper[4929]: I1002 11:34:14.737010 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.120438 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.121244 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a4d58654-fb5e-4caf-a555-0a6865d5337c" containerName="openstackclient" containerID="cri-o://f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6" gracePeriod=2
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.136945 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.137210 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="cinder-scheduler" containerID="cri-o://e0f273d8b045c5750b84ae82eb3de630e392ed10f8ecf765920cb992fe5bf07b" gracePeriod=30
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.137341 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="probe" containerID="cri-o://6b5c65593dc6d88e4e6e6322915fd952cd6ca9e932c894eb48b4bc84d07e1554" gracePeriod=30
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.157711 4929 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/ovsdbserver-nb-0" secret="" err="secret \"ovncluster-ovndbcluster-nb-dockercfg-krs2q\" not found"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.192851 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.192895 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.193280 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api-log" containerID="cri-o://00567a0bc33f8557e178e9e93912b92e11c4d3ee3b160960eaea914e74d1fdf9" gracePeriod=30
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.193723 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api" containerID="cri-o://1369b86d88547970a8f877da10a92d101eafd5fa601f783fd300d42fabfef237" gracePeriod=30
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.231504 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.233595 4929 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openstack/ovndbcluster-nb-etc-ovn-ovsdbserver-nb-0: PVC is being deleted" pod="openstack/ovsdbserver-nb-0" volumeName="ovndbcluster-nb-etc-ovn"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.252530 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.273931 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement5c11-account-delete-cw9t6"]
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.275050 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="registry-server"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.275069 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="registry-server"
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.275082 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="extract-content"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.275090 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="extract-content"
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.275107 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d58654-fb5e-4caf-a555-0a6865d5337c" containerName="openstackclient"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.275118 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d58654-fb5e-4caf-a555-0a6865d5337c" containerName="openstackclient"
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.275153 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="extract-utilities"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.275162 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="extract-utilities"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.275398 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c12969-037f-4e43-a543-9ab439ce90aa" containerName="registry-server"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.275420 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d58654-fb5e-4caf-a555-0a6865d5337c" containerName="openstackclient"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.282459 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement5c11-account-delete-cw9t6"
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.314125 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.314198 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:34:22.814182793 +0000 UTC m=+1463.364549147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.321107 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement5c11-account-delete-cw9t6"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.360743 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.361548 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="openstack-network-exporter" containerID="cri-o://b6f359634a77f769f8031f3619dfe1d92d8655ecb9ac23aa25ab0dfe2e7931a4" gracePeriod=300
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.418193 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6n67\" (UniqueName: \"kubernetes.io/projected/e7503492-8d47-4852-aca4-0bb661665127-kube-api-access-w6n67\") pod \"placement5c11-account-delete-cw9t6\" (UID: \"e7503492-8d47-4852-aca4-0bb661665127\") " pod="openstack/placement5c11-account-delete-cw9t6"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.465224 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican9b10-account-delete-8lnrl"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.466772 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican9b10-account-delete-8lnrl"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.519923 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican9b10-account-delete-8lnrl"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.529999 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6n67\" (UniqueName: \"kubernetes.io/projected/e7503492-8d47-4852-aca4-0bb661665127-kube-api-access-w6n67\") pod \"placement5c11-account-delete-cw9t6\" (UID: \"e7503492-8d47-4852-aca4-0bb661665127\") " pod="openstack/placement5c11-account-delete-cw9t6"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.542025 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancea7fb-account-delete-pvxdd"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.543567 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancea7fb-account-delete-pvxdd"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.567719 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancea7fb-account-delete-pvxdd"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.604885 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6n67\" (UniqueName: \"kubernetes.io/projected/e7503492-8d47-4852-aca4-0bb661665127-kube-api-access-w6n67\") pod \"placement5c11-account-delete-cw9t6\" (UID: \"e7503492-8d47-4852-aca4-0bb661665127\") " pod="openstack/placement5c11-account-delete-cw9t6"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.612525 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-j8b9x"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.637559 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7bzd\" (UniqueName: \"kubernetes.io/projected/df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f-kube-api-access-h7bzd\") pod \"glancea7fb-account-delete-pvxdd\" (UID: \"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f\") " pod="openstack/glancea7fb-account-delete-pvxdd"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.637658 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tts4z\" (UniqueName: \"kubernetes.io/projected/7d52b938-d877-46ba-b19c-7e6331422d01-kube-api-access-tts4z\") pod \"barbican9b10-account-delete-8lnrl\" (UID: \"7d52b938-d877-46ba-b19c-7e6331422d01\") " pod="openstack/barbican9b10-account-delete-8lnrl"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.680164 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="ovsdbserver-sb" containerID="cri-o://45099a9a81d331acf15cd5d4cf4ab34cdedd4a4c511ece106065205a559fb3ec" gracePeriod=300
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.690208 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-j8b9x"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.702503 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement5c11-account-delete-cw9t6"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.732040 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron7b2b-account-delete-ghckr"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.733782 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7b2b-account-delete-ghckr"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.754174 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7bzd\" (UniqueName: \"kubernetes.io/projected/df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f-kube-api-access-h7bzd\") pod \"glancea7fb-account-delete-pvxdd\" (UID: \"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f\") " pod="openstack/glancea7fb-account-delete-pvxdd"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.754486 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tts4z\" (UniqueName: \"kubernetes.io/projected/7d52b938-d877-46ba-b19c-7e6331422d01-kube-api-access-tts4z\") pod \"barbican9b10-account-delete-8lnrl\" (UID: \"7d52b938-d877-46ba-b19c-7e6331422d01\") " pod="openstack/barbican9b10-account-delete-8lnrl"
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.790376 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron7b2b-account-delete-ghckr"]
Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.856662 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/ae9c788d-5f22-443d-aa60-2f9e88dce9fd-kube-api-access-6kksl\") pod \"neutron7b2b-account-delete-ghckr\" (UID: \"ae9c788d-5f22-443d-aa60-2f9e88dce9fd\") " pod="openstack/neutron7b2b-account-delete-ghckr"
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.856975 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 02 11:34:22 crc kubenswrapper[4929]: E1002 11:34:22.857025 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:34:23.85700736 +0000 UTC m=+1464.407373724 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.914389 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39949247-a1b3-41bc-a94a-4c59049576cd","Type":"ContainerDied","Data":"00567a0bc33f8557e178e9e93912b92e11c4d3ee3b160960eaea914e74d1fdf9"} Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.914433 4929 generic.go:334] "Generic (PLEG): container finished" podID="39949247-a1b3-41bc-a94a-4c59049576cd" containerID="00567a0bc33f8557e178e9e93912b92e11c4d3ee3b160960eaea914e74d1fdf9" exitCode=143 Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.930156 4929 generic.go:334] "Generic (PLEG): container finished" podID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerID="b6f359634a77f769f8031f3619dfe1d92d8655ecb9ac23aa25ab0dfe2e7931a4" exitCode=2 Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.930853 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="openstack-network-exporter" containerID="cri-o://31ba20b248cec7058dba917dcf36d5e4fb82da3a9e74b7f3c12e428f1868b6d2" gracePeriod=300 Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.931160 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8946c48-0a50-449c-b64a-e8e4ae2f84ba","Type":"ContainerDied","Data":"b6f359634a77f769f8031f3619dfe1d92d8655ecb9ac23aa25ab0dfe2e7931a4"} Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.940484 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tts4z\" (UniqueName: \"kubernetes.io/projected/7d52b938-d877-46ba-b19c-7e6331422d01-kube-api-access-tts4z\") pod \"barbican9b10-account-delete-8lnrl\" (UID: \"7d52b938-d877-46ba-b19c-7e6331422d01\") " pod="openstack/barbican9b10-account-delete-8lnrl" Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.945683 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7bzd\" (UniqueName: \"kubernetes.io/projected/df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f-kube-api-access-h7bzd\") pod \"glancea7fb-account-delete-pvxdd\" (UID: \"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f\") " pod="openstack/glancea7fb-account-delete-pvxdd" Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.947370 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancea7fb-account-delete-pvxdd" Oct 02 11:34:22 crc kubenswrapper[4929]: I1002 11:34:22.959304 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/ae9c788d-5f22-443d-aa60-2f9e88dce9fd-kube-api-access-6kksl\") pod \"neutron7b2b-account-delete-ghckr\" (UID: \"ae9c788d-5f22-443d-aa60-2f9e88dce9fd\") " pod="openstack/neutron7b2b-account-delete-ghckr" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.021112 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/ae9c788d-5f22-443d-aa60-2f9e88dce9fd-kube-api-access-6kksl\") pod \"neutron7b2b-account-delete-ghckr\" (UID: \"ae9c788d-5f22-443d-aa60-2f9e88dce9fd\") " pod="openstack/neutron7b2b-account-delete-ghckr" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.029590 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.029857 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="ovn-northd" containerID="cri-o://c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.030129 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="openstack-network-exporter" containerID="cri-o://a800d27ad9ba4905470d759a654c04cea37a9ca62559cf4a2feee8d6683bdd38" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.129552 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6zpcs"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.151344 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8rg59"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.155370 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="ovsdbserver-nb" containerID="cri-o://0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773" gracePeriod=300 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.167615 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican9b10-account-delete-8lnrl" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.170951 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6zpcs"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.217036 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8rg59"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.261921 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1b9e6-account-delete-5mvft"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.267415 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1b9e6-account-delete-5mvft" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.288698 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1b9e6-account-delete-5mvft"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.316226 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8kqgz"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.410715 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi6c15-account-delete-bdxsh"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.411893 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6c15-account-delete-bdxsh" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.433634 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi6c15-account-delete-bdxsh"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.437089 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcx7v\" (UniqueName: \"kubernetes.io/projected/44d85c4b-9da3-40d5-a5c3-0aeac38eecee-kube-api-access-qcx7v\") pod \"novacell1b9e6-account-delete-5mvft\" (UID: \"44d85c4b-9da3-40d5-a5c3-0aeac38eecee\") " pod="openstack/novacell1b9e6-account-delete-5mvft" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.460993 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-ld7dp"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.461248 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-ld7dp" podUID="49d60065-8bbd-4182-be31-c0f851790792" containerName="openstack-network-exporter" containerID="cri-o://0d5be4bb5d6960bc1b5676ae7124167aab77ea28d2f3f417b7410eab7da60d97" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.469685 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-fv8ff"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.481132 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qlnn8"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.498142 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zr8dn"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.523063 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell091e2-account-delete-5d9qb"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.524415 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell091e2-account-delete-5d9qb" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.539648 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgvp\" (UniqueName: \"kubernetes.io/projected/45f95478-d16b-4ffb-9389-f68851cce4a6-kube-api-access-hlgvp\") pod \"novacell091e2-account-delete-5d9qb\" (UID: \"45f95478-d16b-4ffb-9389-f68851cce4a6\") " pod="openstack/novacell091e2-account-delete-5d9qb" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.539762 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9qsr\" (UniqueName: \"kubernetes.io/projected/a18e7ab5-8994-4a34-98d2-0e65bbfc4068-kube-api-access-m9qsr\") pod \"novaapi6c15-account-delete-bdxsh\" (UID: \"a18e7ab5-8994-4a34-98d2-0e65bbfc4068\") " pod="openstack/novaapi6c15-account-delete-bdxsh" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.540136 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcx7v\" (UniqueName: \"kubernetes.io/projected/44d85c4b-9da3-40d5-a5c3-0aeac38eecee-kube-api-access-qcx7v\") pod \"novacell1b9e6-account-delete-5mvft\" (UID: \"44d85c4b-9da3-40d5-a5c3-0aeac38eecee\") " pod="openstack/novacell1b9e6-account-delete-5mvft" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.551090 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zr8dn"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.560696 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qlnn8"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.570600 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell091e2-account-delete-5d9qb"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.588510 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9tlgd"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.608004 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-99d9d588b-ddwr8"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.608320 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-99d9d588b-ddwr8" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-log" containerID="cri-o://fbb7ed5b03ef1c144f9d8326fca0faa2b3ea252a058e454c2f5d13b47dbb1af1" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.608714 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-99d9d588b-ddwr8" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-api" containerID="cri-o://157276ef9d3a545d2f5ce4288c1bab5d100b24eab57de2c4e08c2b13bd82b387" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.610637 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcx7v\" (UniqueName: \"kubernetes.io/projected/44d85c4b-9da3-40d5-a5c3-0aeac38eecee-kube-api-access-qcx7v\") pod \"novacell1b9e6-account-delete-5mvft\" (UID: \"44d85c4b-9da3-40d5-a5c3-0aeac38eecee\") " pod="openstack/novacell1b9e6-account-delete-5mvft" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.618298 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9tlgd"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.638452 4929 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/swift-storage-0"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639113 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-server" containerID="cri-o://c422966ff4f7462ce3b91b168f5ac09e2c599380e12f670a2a65404aef3dd588" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639138 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-server" containerID="cri-o://4c94d3591c6c3e15abde4c9e9bda1a1d6451806b1b6b0c671dfee4007ae1a8e3" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639238 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-updater" containerID="cri-o://50576db641077b7a1e9dbf23a9fc5b7cda23206b43329a6994ae96a7b01bca1b" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639304 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-auditor" containerID="cri-o://7fc56f6f53a6276996cea4cb299fd663fe0652dc22f0c71496b40d33cbd4a999" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639352 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-replicator" containerID="cri-o://d8d4bec396dfc299189ef1b9a62ea5ec2484a5fe0492556f6ab9d91d861c28eb" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639392 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-server" containerID="cri-o://b84dbe7e64f60d151dd9bf83d9d60d85c9c02ab74df4e15e5302c38e6a6cc41c" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639429 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-reaper" containerID="cri-o://8035c6c8bfb09505e74f61334bb079defd5376ea17a1624b860ad93bb160b4a7" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639458 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-auditor" containerID="cri-o://cefdca76aa5689375d6555f161f627d6003d8fac34c8df5144f6cacb5ac6866c" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639851 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-replicator" containerID="cri-o://908ca2fd69c5108943cc878199a4b51016780d034eeff8322db65f1600694e85" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639922 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-expirer" containerID="cri-o://1d45e6424e955430dffa5579cf4f5d18c47a7931b6e630ff334c44c39257c19c" 
gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.639980 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="swift-recon-cron" containerID="cri-o://ad49c21a672c805ec312d5fb5f9c9032867c22231864156a14347d73f9b26ac2" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.640035 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="rsync" containerID="cri-o://d463cb431a68d8c8a6bc8838afb25b1c342f4bd84ce2023c1ef8358a6d79a0eb" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.640070 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-auditor" containerID="cri-o://8e7f1a184638b273c379d892f5706e40b2b0e6e8a03f5d40e8cf8e31bb64e072" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.640101 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-updater" containerID="cri-o://e4b14c7b773820d673e84708764e3207f484e07e6581b05635251acbe436a01b" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.640127 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-replicator" containerID="cri-o://b60072b033a9f359be40981e533a243a7b09e82e55c795215b5c3ac05b529145" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.642120 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgvp\" (UniqueName: \"kubernetes.io/projected/45f95478-d16b-4ffb-9389-f68851cce4a6-kube-api-access-hlgvp\") pod \"novacell091e2-account-delete-5d9qb\" (UID: \"45f95478-d16b-4ffb-9389-f68851cce4a6\") " pod="openstack/novacell091e2-account-delete-5d9qb" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.642182 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9qsr\" (UniqueName: \"kubernetes.io/projected/a18e7ab5-8994-4a34-98d2-0e65bbfc4068-kube-api-access-m9qsr\") pod \"novaapi6c15-account-delete-bdxsh\" (UID: \"a18e7ab5-8994-4a34-98d2-0e65bbfc4068\") " pod="openstack/novaapi6c15-account-delete-bdxsh" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.677644 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vmt6n"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.688337 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mxwxk"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.704530 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vmt6n"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.706490 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9qsr\" (UniqueName: \"kubernetes.io/projected/a18e7ab5-8994-4a34-98d2-0e65bbfc4068-kube-api-access-m9qsr\") pod \"novaapi6c15-account-delete-bdxsh\" (UID: \"a18e7ab5-8994-4a34-98d2-0e65bbfc4068\") " pod="openstack/novaapi6c15-account-delete-bdxsh" Oct 02 11:34:23 crc kubenswrapper[4929]: E1002 11:34:23.714613 4929 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39949247_a1b3_41bc_a94a_4c59049576cd.slice/crio-conmon-00567a0bc33f8557e178e9e93912b92e11c4d3ee3b160960eaea914e74d1fdf9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8946c48_0a50_449c_b64a_e8e4ae2f84ba.slice/crio-b6f359634a77f769f8031f3619dfe1d92d8655ecb9ac23aa25ab0dfe2e7931a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8946c48_0a50_449c_b64a_e8e4ae2f84ba.slice/crio-conmon-b6f359634a77f769f8031f3619dfe1d92d8655ecb9ac23aa25ab0dfe2e7931a4.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.716799 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mxwxk"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.730555 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6bjrl"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.730818 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" podUID="692c9c38-07d7-455f-8d9c-984904aef051" containerName="dnsmasq-dns" containerID="cri-o://aae3e40e81be663620ecc6be606854b80d924a650235106f6750681901686f12" gracePeriod=10 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.747526 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgvp\" (UniqueName: \"kubernetes.io/projected/45f95478-d16b-4ffb-9389-f68851cce4a6-kube-api-access-hlgvp\") pod \"novacell091e2-account-delete-5d9qb\" (UID: \"45f95478-d16b-4ffb-9389-f68851cce4a6\") " pod="openstack/novacell091e2-account-delete-5d9qb" Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.799610 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hl5hq"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.820783 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hl5hq"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.831294 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5c11-account-delete-cw9t6"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.845547 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5c11-account-create-8bfzg"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.862287 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5c11-account-create-8bfzg"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.894392 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.894610 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-log" containerID="cri-o://baa739f57ecb397b52caaf29fc37695b9a4448c825f0be5520586e3d2f8dccf3" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.895051 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-httpd" 
containerID="cri-o://41c95ee7304d226a10f65f9f4eb7f25c911fc6a7c2f8bd69eab49e70a8a3e99a" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.911454 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f54bbfbbc-rzbv9"] Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.911893 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f54bbfbbc-rzbv9" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-api" containerID="cri-o://52e15741d914815b2fb093a46215236fea49e8f8564b50718e5c10df7b9ff3e8" gracePeriod=30 Oct 02 11:34:23 crc kubenswrapper[4929]: I1002 11:34:23.912084 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f54bbfbbc-rzbv9" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-httpd" containerID="cri-o://4a26eb13a68fc86fca37ccadbc35bdf199a826d5b4a5034fe350778970631e25" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.017463 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.017900 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:34:26.017874167 +0000 UTC m=+1466.568240531 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.116514 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773 is running failed: container process not found" containerID="0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.120309 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773 is running failed: container process not found" containerID="0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.121154 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773 is running failed: container process not found" containerID="0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.121224 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" 
containerName="ovsdbserver-nb" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.124742 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e8946c48-0a50-449c-b64a-e8e4ae2f84ba/ovsdbserver-sb/0.log" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.124794 4929 generic.go:334] "Generic (PLEG): container finished" podID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerID="45099a9a81d331acf15cd5d4cf4ab34cdedd4a4c511ece106065205a559fb3ec" exitCode=143 Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.144293 4929 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.173:41304->38.102.83.173:39349: write tcp 38.102.83.173:41304->38.102.83.173:39349: write: broken pipe Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.144396 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8946c48-0a50-449c-b64a-e8e4ae2f84ba","Type":"ContainerDied","Data":"45099a9a81d331acf15cd5d4cf4ab34cdedd4a4c511ece106065205a559fb3ec"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.148890 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.149188 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-log" containerID="cri-o://451546a934f38d915cbb04879f0147f97d390d02d97e32ed999e258f1445f92c" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.149610 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-httpd" containerID="cri-o://d65560d220b0508cdd18383acef57b7ee3e6f336bff1911b49d9df550f30b608" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.160543 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron7b2b-account-delete-ghckr" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.168633 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d4a10ac0-e47f-47cf-9779-d60c30b14755/ovsdbserver-nb/0.log" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.168675 4929 generic.go:334] "Generic (PLEG): container finished" podID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerID="31ba20b248cec7058dba917dcf36d5e4fb82da3a9e74b7f3c12e428f1868b6d2" exitCode=2 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.168693 4929 generic.go:334] "Generic (PLEG): container finished" podID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerID="0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773" exitCode=143 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.180358 4929 generic.go:334] "Generic (PLEG): container finished" podID="692c9c38-07d7-455f-8d9c-984904aef051" containerID="aae3e40e81be663620ecc6be606854b80d924a650235106f6750681901686f12" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.183570 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f8a714-fde6-45a2-be8f-8655ab68bb45" path="/var/lib/kubelet/pods/28f8a714-fde6-45a2-be8f-8655ab68bb45/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.184379 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29989848-bd4f-4d93-a71f-95965ef153e8" path="/var/lib/kubelet/pods/29989848-bd4f-4d93-a71f-95965ef153e8/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.184899 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bfcad7-d630-4361-b28d-f072ac3f84a0" path="/var/lib/kubelet/pods/69bfcad7-d630-4361-b28d-f072ac3f84a0/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.186330 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5" path="/var/lib/kubelet/pods/6c9b80a9-e1cb-4492-ad9b-ff8b771fcce5/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.186486 4929 generic.go:334] "Generic (PLEG): container finished" podID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerID="fbb7ed5b03ef1c144f9d8326fca0faa2b3ea252a058e454c2f5d13b47dbb1af1" exitCode=143 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.186908 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892a7315-3f0a-4523-9c05-3a9a8ca321b5" path="/var/lib/kubelet/pods/892a7315-3f0a-4523-9c05-3a9a8ca321b5/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.187416 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89926e8c-7fcf-4877-8be1-8bce650c2ca1" path="/var/lib/kubelet/pods/89926e8c-7fcf-4877-8be1-8bce650c2ca1/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.187911 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f111be9-1a7c-4249-9cd0-26df16b3bf63" path="/var/lib/kubelet/pods/8f111be9-1a7c-4249-9cd0-26df16b3bf63/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.190557 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b3c946-3ee6-4320-ab89-6fb932ec3292" path="/var/lib/kubelet/pods/d1b3c946-3ee6-4320-ab89-6fb932ec3292/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.193250 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73a449d-0b0f-40a9-9cc7-5e44447b2c86" 
path="/var/lib/kubelet/pods/d73a449d-0b0f-40a9-9cc7-5e44447b2c86/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.193812 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de08030d-c53a-49cc-906e-10b22cd577e1" path="/var/lib/kubelet/pods/de08030d-c53a-49cc-906e-10b22cd577e1/volumes" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.196952 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d4a10ac0-e47f-47cf-9779-d60c30b14755","Type":"ContainerDied","Data":"31ba20b248cec7058dba917dcf36d5e4fb82da3a9e74b7f3c12e428f1868b6d2"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.197083 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d4a10ac0-e47f-47cf-9779-d60c30b14755","Type":"ContainerDied","Data":"0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.197097 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" event={"ID":"692c9c38-07d7-455f-8d9c-984904aef051","Type":"ContainerDied","Data":"aae3e40e81be663620ecc6be606854b80d924a650235106f6750681901686f12"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.197109 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99d9d588b-ddwr8" event={"ID":"4b67fd7d-2814-4efd-ad06-ee8283104d49","Type":"ContainerDied","Data":"fbb7ed5b03ef1c144f9d8326fca0faa2b3ea252a058e454c2f5d13b47dbb1af1"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.197243 4929 generic.go:334] "Generic (PLEG): container finished" podID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerID="6b5c65593dc6d88e4e6e6322915fd952cd6ca9e932c894eb48b4bc84d07e1554" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.197324 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2","Type":"ContainerDied","Data":"6b5c65593dc6d88e4e6e6322915fd952cd6ca9e932c894eb48b4bc84d07e1554"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.202178 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-n7hrt"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207410 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="1d45e6424e955430dffa5579cf4f5d18c47a7931b6e630ff334c44c39257c19c" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207438 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="e4b14c7b773820d673e84708764e3207f484e07e6581b05635251acbe436a01b" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207447 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="8e7f1a184638b273c379d892f5706e40b2b0e6e8a03f5d40e8cf8e31bb64e072" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207453 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="b60072b033a9f359be40981e533a243a7b09e82e55c795215b5c3ac05b529145" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207460 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="50576db641077b7a1e9dbf23a9fc5b7cda23206b43329a6994ae96a7b01bca1b" exitCode=0 Oct 02 
11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207469 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="8035c6c8bfb09505e74f61334bb079defd5376ea17a1624b860ad93bb160b4a7" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207475 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="cefdca76aa5689375d6555f161f627d6003d8fac34c8df5144f6cacb5ac6866c" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207482 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="908ca2fd69c5108943cc878199a4b51016780d034eeff8322db65f1600694e85" exitCode=0 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207521 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"1d45e6424e955430dffa5579cf4f5d18c47a7931b6e630ff334c44c39257c19c"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207543 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"e4b14c7b773820d673e84708764e3207f484e07e6581b05635251acbe436a01b"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207554 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"8e7f1a184638b273c379d892f5706e40b2b0e6e8a03f5d40e8cf8e31bb64e072"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207563 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"b60072b033a9f359be40981e533a243a7b09e82e55c795215b5c3ac05b529145"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207598 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"50576db641077b7a1e9dbf23a9fc5b7cda23206b43329a6994ae96a7b01bca1b"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207608 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"8035c6c8bfb09505e74f61334bb079defd5376ea17a1624b860ad93bb160b4a7"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207619 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"cefdca76aa5689375d6555f161f627d6003d8fac34c8df5144f6cacb5ac6866c"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.207628 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"908ca2fd69c5108943cc878199a4b51016780d034eeff8322db65f1600694e85"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.209063 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ld7dp_49d60065-8bbd-4182-be31-c0f851790792/openstack-network-exporter/0.log" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.209095 4929 generic.go:334] "Generic (PLEG): container finished" 
podID="49d60065-8bbd-4182-be31-c0f851790792" containerID="0d5be4bb5d6960bc1b5676ae7124167aab77ea28d2f3f417b7410eab7da60d97" exitCode=2 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.209129 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ld7dp" event={"ID":"49d60065-8bbd-4182-be31-c0f851790792","Type":"ContainerDied","Data":"0d5be4bb5d6960bc1b5676ae7124167aab77ea28d2f3f417b7410eab7da60d97"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.212372 4929 generic.go:334] "Generic (PLEG): container finished" podID="61e50682-8502-4570-916a-a3b90a5218e4" containerID="a800d27ad9ba4905470d759a654c04cea37a9ca62559cf4a2feee8d6683bdd38" exitCode=2 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.212417 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"61e50682-8502-4570-916a-a3b90a5218e4","Type":"ContainerDied","Data":"a800d27ad9ba4905470d759a654c04cea37a9ca62559cf4a2feee8d6683bdd38"} Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.218308 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-n7hrt"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.229466 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican9b10-account-delete-8lnrl"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.242566 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-9b10-account-create-672ps"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.244375 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6c15-account-delete-bdxsh" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.253179 4929 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell1b9e6-account-delete-5mvft" secret="" err="secret \"galera-openstack-cell1-dockercfg-4n9fb\" not found" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.253247 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1b9e6-account-delete-5mvft" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.259610 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-9b10-account-create-672ps"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.261680 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell091e2-account-delete-5d9qb" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.272292 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ld7dp_49d60065-8bbd-4182-be31-c0f851790792/openstack-network-exporter/0.log" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.272561 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.276313 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-84bfc"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.299301 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-84bfc"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.318778 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0fa5-account-create-tnbhh"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.328775 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0fa5-account-create-tnbhh"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.336094 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-cmqkl"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.345257 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovn-rundir\") pod \"49d60065-8bbd-4182-be31-c0f851790792\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.345366 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-metrics-certs-tls-certs\") pod \"49d60065-8bbd-4182-be31-c0f851790792\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.345411 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49d60065-8bbd-4182-be31-c0f851790792-config\") pod \"49d60065-8bbd-4182-be31-c0f851790792\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.345438 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-combined-ca-bundle\") pod \"49d60065-8bbd-4182-be31-c0f851790792\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.345486 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9q4j\" (UniqueName: \"kubernetes.io/projected/49d60065-8bbd-4182-be31-c0f851790792-kube-api-access-s9q4j\") pod \"49d60065-8bbd-4182-be31-c0f851790792\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.345520 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovs-rundir\") pod \"49d60065-8bbd-4182-be31-c0f851790792\" (UID: \"49d60065-8bbd-4182-be31-c0f851790792\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.346781 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49d60065-8bbd-4182-be31-c0f851790792-config" (OuterVolumeSpecName: "config") pod "49d60065-8bbd-4182-be31-c0f851790792" (UID: "49d60065-8bbd-4182-be31-c0f851790792"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.346829 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "49d60065-8bbd-4182-be31-c0f851790792" (UID: "49d60065-8bbd-4182-be31-c0f851790792"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.349786 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-cmqkl"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.349788 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "49d60065-8bbd-4182-be31-c0f851790792" (UID: "49d60065-8bbd-4182-be31-c0f851790792"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.357516 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d60065-8bbd-4182-be31-c0f851790792-kube-api-access-s9q4j" (OuterVolumeSpecName: "kube-api-access-s9q4j") pod "49d60065-8bbd-4182-be31-c0f851790792" (UID: "49d60065-8bbd-4182-be31-c0f851790792"). InnerVolumeSpecName "kube-api-access-s9q4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.384422 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea7fb-account-delete-pvxdd"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.391259 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a7fb-account-create-kq6p5"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.404873 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd" containerID="cri-o://23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.413950 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a7fb-account-create-kq6p5"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.449306 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.449577 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49d60065-8bbd-4182-be31-c0f851790792-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.449587 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9q4j\" (UniqueName: \"kubernetes.io/projected/49d60065-8bbd-4182-be31-c0f851790792-kube-api-access-s9q4j\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.449595 4929 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/49d60065-8bbd-4182-be31-c0f851790792-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.468389 4929 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d60065-8bbd-4182-be31-c0f851790792" (UID: "49d60065-8bbd-4182-be31-c0f851790792"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.469124 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.469351 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-log" containerID="cri-o://3153b1529e0c32bf3f30edef4acbd966c8aa5b2583cf6efe7dd0a6f5cab02ebd" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.470263 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-metadata" containerID="cri-o://b99a17b7ec88652946955b5fdf985f5b9d3d8bd15ef24dfadcf98117eac94d02" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.477176 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-69566f664c-jps5x"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.477554 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-69566f664c-jps5x" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-httpd" containerID="cri-o://a9694eb8911f9ff8551424e463b8bef808eae32090d923b999eba6998b0901c3" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.477748 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-69566f664c-jps5x" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-server" containerID="cri-o://027e91ba08dcef685cce7a361a75e6c896ae57660ea313b7e71bbff0e07f1279" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.500123 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-655677957d-l5jzm"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.500345 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener-log" containerID="cri-o://e3c00d90ab5c8fdbfb94fad352ef76e3b0dd878ba364460bbb331b6a693a2e07" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.500454 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener" containerID="cri-o://67d3645ec9cfed216d6036455755f8b22923aae7acb7d9366f616685db4f7af8" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.530701 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.554444 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.559340 
4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.559657 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-log" containerID="cri-o://a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.560338 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-api" containerID="cri-o://ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.575477 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-m8bs6"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.577157 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "49d60065-8bbd-4182-be31-c0f851790792" (UID: "49d60065-8bbd-4182-be31-c0f851790792"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.586313 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1b9e6-account-delete-5mvft"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.595081 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-m8bs6"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.603752 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b9e6-account-create-gtt87"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.619361 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5r4p6"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.643743 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6f9b7d8ff7-88gb5"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.643942 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker-log" containerID="cri-o://241b3468cb9d97e9ba6f173143b44936f01809a56d73599a592eacd87d2efe4d" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.644423 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker" containerID="cri-o://50ab6d4116dc1e4db95a4dd8529214c90a135250b4f2785bbf13989c07ed52bf" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.649877 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b9e6-account-create-gtt87"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.657251 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/49d60065-8bbd-4182-be31-c0f851790792-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.659716 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-5r4p6"] Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.662657 4929 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 02 11:34:24 crc kubenswrapper[4929]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 11:34:24 crc kubenswrapper[4929]: + source /usr/local/bin/container-scripts/functions Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNBridge=br-int Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNRemote=tcp:localhost:6642 Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNEncapType=geneve Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNAvailabilityZones= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ EnableChassisAsGateway=true Oct 02 11:34:24 crc kubenswrapper[4929]: ++ PhysicalNetworks= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNHostName= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 11:34:24 crc kubenswrapper[4929]: ++ ovs_dir=/var/lib/openvswitch Oct 02 11:34:24 crc kubenswrapper[4929]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 11:34:24 crc kubenswrapper[4929]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 11:34:24 crc kubenswrapper[4929]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + sleep 0.5 Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + sleep 0.5 Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + cleanup_ovsdb_server_semaphore Oct 02 11:34:24 crc kubenswrapper[4929]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:34:24 crc kubenswrapper[4929]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 11:34:24 crc kubenswrapper[4929]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-fv8ff" message=< Oct 02 11:34:24 crc kubenswrapper[4929]: Exiting ovsdb-server (5) [ OK ] Oct 02 11:34:24 crc kubenswrapper[4929]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 11:34:24 crc kubenswrapper[4929]: + source /usr/local/bin/container-scripts/functions Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNBridge=br-int Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNRemote=tcp:localhost:6642 Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNEncapType=geneve Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNAvailabilityZones= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ EnableChassisAsGateway=true Oct 02 11:34:24 crc kubenswrapper[4929]: ++ PhysicalNetworks= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNHostName= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 11:34:24 crc kubenswrapper[4929]: ++ ovs_dir=/var/lib/openvswitch Oct 02 11:34:24 crc kubenswrapper[4929]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 11:34:24 crc kubenswrapper[4929]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 11:34:24 crc kubenswrapper[4929]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + sleep 0.5 Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + sleep 0.5 Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + cleanup_ovsdb_server_semaphore Oct 02 11:34:24 crc kubenswrapper[4929]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:34:24 crc kubenswrapper[4929]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 11:34:24 crc kubenswrapper[4929]: > Oct 02 11:34:24 crc kubenswrapper[4929]: E1002 11:34:24.662689 4929 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 02 11:34:24 crc kubenswrapper[4929]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 11:34:24 crc kubenswrapper[4929]: + source /usr/local/bin/container-scripts/functions Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNBridge=br-int Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNRemote=tcp:localhost:6642 Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNEncapType=geneve Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNAvailabilityZones= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ EnableChassisAsGateway=true Oct 02 11:34:24 crc kubenswrapper[4929]: ++ PhysicalNetworks= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ OVNHostName= Oct 02 11:34:24 crc kubenswrapper[4929]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 11:34:24 crc kubenswrapper[4929]: ++ ovs_dir=/var/lib/openvswitch Oct 02 11:34:24 crc kubenswrapper[4929]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 11:34:24 crc kubenswrapper[4929]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 11:34:24 crc kubenswrapper[4929]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + sleep 0.5 Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + sleep 0.5 Oct 02 11:34:24 crc kubenswrapper[4929]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:34:24 crc kubenswrapper[4929]: + cleanup_ovsdb_server_semaphore Oct 02 11:34:24 crc kubenswrapper[4929]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:34:24 crc kubenswrapper[4929]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 11:34:24 crc kubenswrapper[4929]: > pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server" containerID="cri-o://5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.662726 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server" containerID="cri-o://5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" gracePeriod=29 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.669236 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6c15-account-create-zstdt"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.678753 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6c15-account-create-zstdt"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.688599 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi6c15-account-delete-bdxsh"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.696321 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-91e2-account-create-kh629"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.703449 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e8946c48-0a50-449c-b64a-e8e4ae2f84ba/ovsdbserver-sb/0.log" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.703531 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.703596 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-91e2-account-create-kh629"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.710580 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell091e2-account-delete-5d9qb"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.717220 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56d58dd68b-qlcrz"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.722861 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56d58dd68b-qlcrz" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api-log" containerID="cri-o://21cd6c9af3eab7ce82f56ddfb8a37ffb6223a10844bf34a7a2c8ec24da313794" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.722873 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56d58dd68b-qlcrz" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api" containerID="cri-o://e53b3ad5b5ce14f2519bfc0e1a58672bd568af0b97109869b7485930f064cca6" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.725740 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vcg7z"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.748924 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vcg7z"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759015 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-scripts\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759207 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdbserver-sb-tls-certs\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759231 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-combined-ca-bundle\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759307 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qdvn\" (UniqueName: \"kubernetes.io/projected/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-kube-api-access-2qdvn\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759388 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-config\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759403 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-metrics-certs-tls-certs\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759513 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdb-rundir\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.759543 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\" (UID: \"e8946c48-0a50-449c-b64a-e8e4ae2f84ba\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.760982 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-scripts" (OuterVolumeSpecName: "scripts") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.761446 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-config" (OuterVolumeSpecName: "config") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.770458 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.773752 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-kube-api-access-2qdvn" (OuterVolumeSpecName: "kube-api-access-2qdvn") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "kube-api-access-2qdvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.779104 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.779306 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e86f887d-db93-49c4-85ed-add5f01b25f7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.791694 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.834112 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.834534 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0ba6072c-759c-4261-8107-8243d262003d" containerName="nova-scheduler-scheduler" containerID="cri-o://212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.863628 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.863663 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.863678 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qdvn\" (UniqueName: \"kubernetes.io/projected/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-kube-api-access-2qdvn\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.863692 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.863702 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.864973 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.892875 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.893281 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="95ec6412-e313-4ed7-ae20-d531571b5be6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.898407 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7gbc"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.904222 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j7gbc"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.905502 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.905945 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="b07c8ee2-5443-410c-b2ab-b48699694626" containerName="galera" containerID="cri-o://2b859fd219d68a03c833c80a4486933cb925eae152050cfa50df66277e417160" gracePeriod=30 Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.938071 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d4a10ac0-e47f-47cf-9779-d60c30b14755/ovsdbserver-nb/0.log" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.948666 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.967008 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8mpd\" (UniqueName: \"kubernetes.io/projected/692c9c38-07d7-455f-8d9c-984904aef051-kube-api-access-s8mpd\") pod \"692c9c38-07d7-455f-8d9c-984904aef051\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.967317 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-config\") pod \"692c9c38-07d7-455f-8d9c-984904aef051\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.967454 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-svc\") pod \"692c9c38-07d7-455f-8d9c-984904aef051\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.967597 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-nb\") pod \"692c9c38-07d7-455f-8d9c-984904aef051\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.968185 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-swift-storage-0\") pod \"692c9c38-07d7-455f-8d9c-984904aef051\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.968327 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-sb\") pod \"692c9c38-07d7-455f-8d9c-984904aef051\" (UID: \"692c9c38-07d7-455f-8d9c-984904aef051\") " Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.979213 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p5n6r"] Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.987841 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:24 crc kubenswrapper[4929]: I1002 11:34:24.993692 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:34:24 crc 
kubenswrapper[4929]: I1002 11:34:24.994700 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="b8b9fa36-f990-4cce-9544-23828715aa54" containerName="nova-cell1-conductor-conductor" containerID="cri-o://8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" gracePeriod=30 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.016299 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692c9c38-07d7-455f-8d9c-984904aef051-kube-api-access-s8mpd" (OuterVolumeSpecName: "kube-api-access-s8mpd") pod "692c9c38-07d7-455f-8d9c-984904aef051" (UID: "692c9c38-07d7-455f-8d9c-984904aef051"). InnerVolumeSpecName "kube-api-access-s8mpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.019406 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p5n6r"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.069019 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.072642 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5c11-account-delete-cw9t6"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090128 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26fx6\" (UniqueName: \"kubernetes.io/projected/d4a10ac0-e47f-47cf-9779-d60c30b14755-kube-api-access-26fx6\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090172 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdb-rundir\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090195 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-scripts\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090216 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090311 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdbserver-nb-tls-certs\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090353 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-combined-ca-bundle\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090391 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-config\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090439 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-metrics-certs-tls-certs\") pod \"d4a10ac0-e47f-47cf-9779-d60c30b14755\" (UID: \"d4a10ac0-e47f-47cf-9779-d60c30b14755\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090834 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.090855 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8mpd\" (UniqueName: \"kubernetes.io/projected/692c9c38-07d7-455f-8d9c-984904aef051-kube-api-access-s8mpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.092222 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.092560 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.092915 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-config" (OuterVolumeSpecName: "config") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.093099 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-scripts" (OuterVolumeSpecName: "scripts") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.102089 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.105847 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a10ac0-e47f-47cf-9779-d60c30b14755-kube-api-access-26fx6" (OuterVolumeSpecName: "kube-api-access-26fx6") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "kube-api-access-26fx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.130331 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e8946c48-0a50-449c-b64a-e8e4ae2f84ba" (UID: "e8946c48-0a50-449c-b64a-e8e4ae2f84ba"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.162434 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "692c9c38-07d7-455f-8d9c-984904aef051" (UID: "692c9c38-07d7-455f-8d9c-984904aef051"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.170667 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-config" (OuterVolumeSpecName: "config") pod "692c9c38-07d7-455f-8d9c-984904aef051" (UID: "692c9c38-07d7-455f-8d9c-984904aef051"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.175386 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.178946 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "692c9c38-07d7-455f-8d9c-984904aef051" (UID: "692c9c38-07d7-455f-8d9c-984904aef051"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.187041 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "692c9c38-07d7-455f-8d9c-984904aef051" (UID: "692c9c38-07d7-455f-8d9c-984904aef051"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192312 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192383 4929 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192405 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192414 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192423 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8946c48-0a50-449c-b64a-e8e4ae2f84ba-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192433 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192441 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26fx6\" (UniqueName: \"kubernetes.io/projected/d4a10ac0-e47f-47cf-9779-d60c30b14755-kube-api-access-26fx6\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192449 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192458 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192466 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4a10ac0-e47f-47cf-9779-d60c30b14755-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192491 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.192501 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.197479 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "692c9c38-07d7-455f-8d9c-984904aef051" (UID: "692c9c38-07d7-455f-8d9c-984904aef051"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.228560 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.296343 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-combined-ca-bundle\") pod \"a4d58654-fb5e-4caf-a555-0a6865d5337c\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.296691 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config\") pod \"a4d58654-fb5e-4caf-a555-0a6865d5337c\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.296759 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj9s4\" (UniqueName: \"kubernetes.io/projected/a4d58654-fb5e-4caf-a555-0a6865d5337c-kube-api-access-lj9s4\") pod \"a4d58654-fb5e-4caf-a555-0a6865d5337c\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.296824 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config-secret\") pod \"a4d58654-fb5e-4caf-a555-0a6865d5337c\" (UID: \"a4d58654-fb5e-4caf-a555-0a6865d5337c\") " Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.297343 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.297359 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/692c9c38-07d7-455f-8d9c-984904aef051-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.297368 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.305443 4929 generic.go:334] "Generic (PLEG): container finished" podID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerID="baa739f57ecb397b52caaf29fc37695b9a4448c825f0be5520586e3d2f8dccf3" exitCode=143 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.305690 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a15b3dd7-69b2-480e-b61d-bba396447b88","Type":"ContainerDied","Data":"baa739f57ecb397b52caaf29fc37695b9a4448c825f0be5520586e3d2f8dccf3"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.316326 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.318404 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.319528 4929 generic.go:334] "Generic (PLEG): container finished" podID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerID="a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd" exitCode=143 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.319609 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fafc589-0041-44b2-a66b-93f4676c3cb1","Type":"ContainerDied","Data":"a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.326785 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d58654-fb5e-4caf-a555-0a6865d5337c-kube-api-access-lj9s4" (OuterVolumeSpecName: "kube-api-access-lj9s4") pod "a4d58654-fb5e-4caf-a555-0a6865d5337c" (UID: "a4d58654-fb5e-4caf-a555-0a6865d5337c"). InnerVolumeSpecName "kube-api-access-lj9s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.329371 4929 generic.go:334] "Generic (PLEG): container finished" podID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerID="451546a934f38d915cbb04879f0147f97d390d02d97e32ed999e258f1445f92c" exitCode=143 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.329423 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8ebda2a-aee6-4eed-8333-5e96219fdcb3","Type":"ContainerDied","Data":"451546a934f38d915cbb04879f0147f97d390d02d97e32ed999e258f1445f92c"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.342566 4929 generic.go:334] "Generic (PLEG): container finished" podID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerID="21cd6c9af3eab7ce82f56ddfb8a37ffb6223a10844bf34a7a2c8ec24da313794" exitCode=143 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.342757 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d58dd68b-qlcrz" event={"ID":"df0101ab-4fa3-4475-a685-fdd9ebb0ef68","Type":"ContainerDied","Data":"21cd6c9af3eab7ce82f56ddfb8a37ffb6223a10844bf34a7a2c8ec24da313794"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.343573 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.350711 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ld7dp_49d60065-8bbd-4182-be31-c0f851790792/openstack-network-exporter/0.log" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.351555 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ld7dp" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.353232 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ld7dp" event={"ID":"49d60065-8bbd-4182-be31-c0f851790792","Type":"ContainerDied","Data":"e08e1c3c89fd6abea1f07639ead7380e935d6a53a8c6b44a0f36153d671d3f11"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.353358 4929 scope.go:117] "RemoveContainer" containerID="0d5be4bb5d6960bc1b5676ae7124167aab77ea28d2f3f417b7410eab7da60d97" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.364219 4929 generic.go:334] "Generic (PLEG): container finished" podID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerID="241b3468cb9d97e9ba6f173143b44936f01809a56d73599a592eacd87d2efe4d" exitCode=143 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.364390 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" event={"ID":"55fd721a-9a86-4aff-98ee-133ebd5c4f41","Type":"ContainerDied","Data":"241b3468cb9d97e9ba6f173143b44936f01809a56d73599a592eacd87d2efe4d"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.374551 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a4d58654-fb5e-4caf-a555-0a6865d5337c" (UID: "a4d58654-fb5e-4caf-a555-0a6865d5337c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.375721 4929 generic.go:334] "Generic (PLEG): container finished" podID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerID="3153b1529e0c32bf3f30edef4acbd966c8aa5b2583cf6efe7dd0a6f5cab02ebd" exitCode=143 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.375849 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5faf6a4-6d67-4104-817f-422bdde6bf30","Type":"ContainerDied","Data":"3153b1529e0c32bf3f30edef4acbd966c8aa5b2583cf6efe7dd0a6f5cab02ebd"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.393522 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican9b10-account-delete-8lnrl"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.399512 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4d58654-fb5e-4caf-a555-0a6865d5337c" (UID: "a4d58654-fb5e-4caf-a555-0a6865d5337c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.401670 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj9s4\" (UniqueName: \"kubernetes.io/projected/a4d58654-fb5e-4caf-a555-0a6865d5337c-kube-api-access-lj9s4\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.401839 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.401913 4929 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.402006 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.402758 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.402917 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data podName:dfb673e7-59bc-41d1-9bf0-d20527c4a740 nodeName:}" failed. No retries permitted until 2025-10-02 11:34:25.90289697 +0000 UTC m=+1466.453263334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data") pod "rabbitmq-cell1-server-0" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.405473 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e8946c48-0a50-449c-b64a-e8e4ae2f84ba/ovsdbserver-sb/0.log" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.405630 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.407910 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8946c48-0a50-449c-b64a-e8e4ae2f84ba","Type":"ContainerDied","Data":"f9bea5dcf2e3eda55b2a4b8dd153572fc11deb39fa45a41ab37dcfd6e9d95f97"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.415633 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea7fb-account-delete-pvxdd"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.417453 4929 scope.go:117] "RemoveContainer" containerID="b6f359634a77f769f8031f3619dfe1d92d8655ecb9ac23aa25ab0dfe2e7931a4" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.419388 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-ld7dp"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.427241 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-ld7dp"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.446168 4929 generic.go:334] "Generic (PLEG): container finished" podID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerID="e3c00d90ab5c8fdbfb94fad352ef76e3b0dd878ba364460bbb331b6a693a2e07" exitCode=143 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.446376 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" event={"ID":"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f","Type":"ContainerDied","Data":"e3c00d90ab5c8fdbfb94fad352ef76e3b0dd878ba364460bbb331b6a693a2e07"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.482520 4929 generic.go:334] "Generic (PLEG): container finished" podID="a4d58654-fb5e-4caf-a555-0a6865d5337c" containerID="f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6" exitCode=137 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.482720 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.518087 4929 generic.go:334] "Generic (PLEG): container finished" podID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerID="027e91ba08dcef685cce7a361a75e6c896ae57660ea313b7e71bbff0e07f1279" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.518138 4929 generic.go:334] "Generic (PLEG): container finished" podID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerID="a9694eb8911f9ff8551424e463b8bef808eae32090d923b999eba6998b0901c3" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.518216 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69566f664c-jps5x" event={"ID":"23c56c4a-763f-4ce6-8b1f-d862662b16ec","Type":"ContainerDied","Data":"027e91ba08dcef685cce7a361a75e6c896ae57660ea313b7e71bbff0e07f1279"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.518249 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69566f664c-jps5x" event={"ID":"23c56c4a-763f-4ce6-8b1f-d862662b16ec","Type":"ContainerDied","Data":"a9694eb8911f9ff8551424e463b8bef808eae32090d923b999eba6998b0901c3"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.576631 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.581824 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d4a10ac0-e47f-47cf-9779-d60c30b14755/ovsdbserver-nb/0.log" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.581945 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d4a10ac0-e47f-47cf-9779-d60c30b14755","Type":"ContainerDied","Data":"cd475f3b9f729cb1b71f3428fd7ec5c6534a3c592ebeea0d48699cbcb45c3273"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.582082 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.622642 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi6c15-account-delete-bdxsh"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.642780 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a4d58654-fb5e-4caf-a555-0a6865d5337c" (UID: "a4d58654-fb5e-4caf-a555-0a6865d5337c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.642934 4929 generic.go:334] "Generic (PLEG): container finished" podID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.643022 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fv8ff" event={"ID":"0e942503-506b-4a11-aa8b-ca122be42fbb","Type":"ContainerDied","Data":"5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.660117 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d4a10ac0-e47f-47cf-9779-d60c30b14755" (UID: "d4a10ac0-e47f-47cf-9779-d60c30b14755"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.722336 4929 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a4d58654-fb5e-4caf-a555-0a6865d5337c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.722641 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4a10ac0-e47f-47cf-9779-d60c30b14755-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.778716 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.779768 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron7b2b-account-delete-ghckr"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782297 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="d463cb431a68d8c8a6bc8838afb25b1c342f4bd84ce2023c1ef8358a6d79a0eb" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782419 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="4c94d3591c6c3e15abde4c9e9bda1a1d6451806b1b6b0c671dfee4007ae1a8e3" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782490 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="7fc56f6f53a6276996cea4cb299fd663fe0652dc22f0c71496b40d33cbd4a999" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782557 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="d8d4bec396dfc299189ef1b9a62ea5ec2484a5fe0492556f6ab9d91d861c28eb" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782626 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="b84dbe7e64f60d151dd9bf83d9d60d85c9c02ab74df4e15e5302c38e6a6cc41c" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782709 4929 
generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="c422966ff4f7462ce3b91b168f5ac09e2c599380e12f670a2a65404aef3dd588" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782824 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"d463cb431a68d8c8a6bc8838afb25b1c342f4bd84ce2023c1ef8358a6d79a0eb"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.782912 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"4c94d3591c6c3e15abde4c9e9bda1a1d6451806b1b6b0c671dfee4007ae1a8e3"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.783093 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"7fc56f6f53a6276996cea4cb299fd663fe0652dc22f0c71496b40d33cbd4a999"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.783178 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"d8d4bec396dfc299189ef1b9a62ea5ec2484a5fe0492556f6ab9d91d861c28eb"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.783266 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"b84dbe7e64f60d151dd9bf83d9d60d85c9c02ab74df4e15e5302c38e6a6cc41c"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.783338 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"c422966ff4f7462ce3b91b168f5ac09e2c599380e12f670a2a65404aef3dd588"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.799934 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1b9e6-account-delete-5mvft"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.838088 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5c11-account-delete-cw9t6" event={"ID":"e7503492-8d47-4852-aca4-0bb661665127","Type":"ContainerStarted","Data":"6492ca5a8cfa08ed0e452342878ec82ca80db26bbbc63b72d6819758570fa397"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.838131 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f54bbfbbc-rzbv9" event={"ID":"62e033b9-12bd-4de4-ba18-807beaca68db","Type":"ContainerDied","Data":"4a26eb13a68fc86fca37ccadbc35bdf199a826d5b4a5034fe350778970631e25"} Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.813761 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.834537 4929 generic.go:334] "Generic (PLEG): container finished" podID="62e033b9-12bd-4de4-ba18-807beaca68db" containerID="4a26eb13a68fc86fca37ccadbc35bdf199a826d5b4a5034fe350778970631e25" exitCode=0 Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.860086 4929 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.860147 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="95ec6412-e313-4ed7-ae20-d531571b5be6" containerName="nova-cell0-conductor-conductor" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.877519 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" event={"ID":"692c9c38-07d7-455f-8d9c-984904aef051","Type":"ContainerDied","Data":"37d13757387e0e0ed1cc4e1a877cea686e239b0e872288b067ae4fd7a59db191"} Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.878238 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6bjrl" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.883695 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell091e2-account-delete-5d9qb"] Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.927494 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:25 crc kubenswrapper[4929]: E1002 11:34:25.927594 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data podName:dfb673e7-59bc-41d1-9bf0-d20527c4a740 nodeName:}" failed. No retries permitted until 2025-10-02 11:34:26.927574966 +0000 UTC m=+1467.477941330 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data") pod "rabbitmq-cell1-server-0" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.980029 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.174:8776/healthcheck\": read tcp 10.217.0.2:43942->10.217.0.174:8776: read: connection reset by peer" Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.999829 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.005773 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.032601 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.032674 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:34:30.032656598 +0000 UTC m=+1470.583022962 (durationBeforeRetry 4s). 
Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.980029 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.174:8776/healthcheck\": read tcp 10.217.0.2:43942->10.217.0.174:8776: read: connection reset by peer"
Oct 02 11:34:25 crc kubenswrapper[4929]: I1002 11:34:25.999829 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.005773 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.032601 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.032674 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:34:30.032656598 +0000 UTC m=+1470.583022962 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.044110 4929 scope.go:117] "RemoveContainer" containerID="45099a9a81d331acf15cd5d4cf4ab34cdedd4a4c511ece106065205a559fb3ec"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.046010 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6bjrl"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.066067 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6bjrl"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.080353 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.096820 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.167025 4929 scope.go:117] "RemoveContainer" containerID="f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.215432 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142b5038-3c60-46fa-bcdf-97a9ae30f2c2" path="/var/lib/kubelet/pods/142b5038-3c60-46fa-bcdf-97a9ae30f2c2/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.217281 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20069136-d22e-4c10-b7c1-a4a735c4aaf3" path="/var/lib/kubelet/pods/20069136-d22e-4c10-b7c1-a4a735c4aaf3/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.217899 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd43d17-114a-4e7b-92ff-0d7b422aa645" path="/var/lib/kubelet/pods/2cd43d17-114a-4e7b-92ff-0d7b422aa645/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.225579 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.235584 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ae93dc-6dea-4651-b80a-896bf744ed05" path="/var/lib/kubelet/pods/35ae93dc-6dea-4651-b80a-896bf744ed05/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.236271 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444e3b63-3bc5-4324-8431-53991c7c7256" path="/var/lib/kubelet/pods/444e3b63-3bc5-4324-8431-53991c7c7256/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.236902 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4783768c-7664-4ff2-98a8-69e9d744462c" path="/var/lib/kubelet/pods/4783768c-7664-4ff2-98a8-69e9d744462c/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.238244 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d60065-8bbd-4182-be31-c0f851790792" path="/var/lib/kubelet/pods/49d60065-8bbd-4182-be31-c0f851790792/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.239032 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4490c9-7acf-4c3e-96d9-e88b6d777b8a" path="/var/lib/kubelet/pods/4b4490c9-7acf-4c3e-96d9-e88b6d777b8a/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.239613 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692c9c38-07d7-455f-8d9c-984904aef051" path="/var/lib/kubelet/pods/692c9c38-07d7-455f-8d9c-984904aef051/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.242740 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.245548 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b530bc3-0b1f-4607-9306-bba090124d3c" path="/var/lib/kubelet/pods/9b530bc3-0b1f-4607-9306-bba090124d3c/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.246586 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fcbd13a-778f-42a3-872a-92a03324a4be" path="/var/lib/kubelet/pods/9fcbd13a-778f-42a3-872a-92a03324a4be/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.248659 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4d58654-fb5e-4caf-a555-0a6865d5337c" path="/var/lib/kubelet/pods/a4d58654-fb5e-4caf-a555-0a6865d5337c/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.249402 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c392507b-f994-40e8-aa01-63c09691bfa0" path="/var/lib/kubelet/pods/c392507b-f994-40e8-aa01-63c09691bfa0/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.270758 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3dc3d09-dbd9-4528-ab2f-17bb08d89d85" path="/var/lib/kubelet/pods/c3dc3d09-dbd9-4528-ab2f-17bb08d89d85/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.271300 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d193edfe-fe8e-43fb-a328-7018cd7ab38e" path="/var/lib/kubelet/pods/d193edfe-fe8e-43fb-a328-7018cd7ab38e/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.271901 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" path="/var/lib/kubelet/pods/d4a10ac0-e47f-47cf-9779-d60c30b14755/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.281929 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2" path="/var/lib/kubelet/pods/dbfca599-7e80-45c0-b3c0-2b3e7f89b6f2/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.282800 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" path="/var/lib/kubelet/pods/e8946c48-0a50-449c-b64a-e8e4ae2f84ba/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.290402 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf6eee5-d420-48f9-b5d7-e591fc2ba33c" path="/var/lib/kubelet/pods/faf6eee5-d420-48f9-b5d7-e591fc2ba33c/volumes"
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.359889 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.397226 4929 scope.go:117] "RemoveContainer" containerID="f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6"
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.397385 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.397520 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.397605 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="b8b9fa36-f990-4cce-9544-23828715aa54" containerName="nova-cell1-conductor-conductor"
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.404133 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6\": container with ID starting with f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6 not found: ID does not exist" containerID="f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.404179 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6"} err="failed to get container status \"f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6\": rpc error: code = NotFound desc = could not find container \"f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6\": container with ID starting with f2debff26cd09d8cb9b6f8e3e618551decd8f74eed418c3d5a9ac7e89b8372b6 not found: ID does not exist"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.404210 4929 scope.go:117] "RemoveContainer" containerID="31ba20b248cec7058dba917dcf36d5e4fb82da3a9e74b7f3c12e428f1868b6d2"
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.409692 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.409752 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="ovn-northd"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.619619 4929 scope.go:117] "RemoveContainer" containerID="0d706d9b8e206c73ee5edf3feec0f93d08bd03a9cabc5a8ca419652989c18773"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.666774 4929 scope.go:117] "RemoveContainer" containerID="aae3e40e81be663620ecc6be606854b80d924a650235106f6750681901686f12"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.688168 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69566f664c-jps5x"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.767054 4929 scope.go:117] "RemoveContainer" containerID="ca836523d4cd1ebebcad4ee242c555398051bedcf444894c067e5f3d2639700e"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789149 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-etc-swift\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789197 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-log-httpd\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789273 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-run-httpd\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789398 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-config-data\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789454 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-public-tls-certs\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789504 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-combined-ca-bundle\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789587 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46q5k\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-kube-api-access-46q5k\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.789628 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-internal-tls-certs\") pod \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\" (UID: \"23c56c4a-763f-4ce6-8b1f-d862662b16ec\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.803637 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.804307 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.811597 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-kube-api-access-46q5k" (OuterVolumeSpecName: "kube-api-access-46q5k") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "kube-api-access-46q5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.819958 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.855303 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.944948 4929 generic.go:334] "Generic (PLEG): container finished" podID="39949247-a1b3-41bc-a94a-4c59049576cd" containerID="1369b86d88547970a8f877da10a92d101eafd5fa601f783fd300d42fabfef237" exitCode=0
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.945048 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39949247-a1b3-41bc-a94a-4c59049576cd","Type":"ContainerDied","Data":"1369b86d88547970a8f877da10a92d101eafd5fa601f783fd300d42fabfef237"}
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.951535 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq6cz\" (UniqueName: \"kubernetes.io/projected/e86f887d-db93-49c4-85ed-add5f01b25f7-kube-api-access-pq6cz\") pod \"e86f887d-db93-49c4-85ed-add5f01b25f7\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.953996 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-nova-novncproxy-tls-certs\") pod \"e86f887d-db93-49c4-85ed-add5f01b25f7\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.954138 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-combined-ca-bundle\") pod \"e86f887d-db93-49c4-85ed-add5f01b25f7\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.954254 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-vencrypt-tls-certs\") pod \"e86f887d-db93-49c4-85ed-add5f01b25f7\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.954414 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-config-data\") pod \"e86f887d-db93-49c4-85ed-add5f01b25f7\" (UID: \"e86f887d-db93-49c4-85ed-add5f01b25f7\") "
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.955746 4929 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.956628 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.956720 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23c56c4a-763f-4ce6-8b1f-d862662b16ec-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.956810 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46q5k\" (UniqueName: \"kubernetes.io/projected/23c56c4a-763f-4ce6-8b1f-d862662b16ec-kube-api-access-46q5k\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.956524 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 02 11:34:26 crc kubenswrapper[4929]: E1002 11:34:26.957054 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data podName:dfb673e7-59bc-41d1-9bf0-d20527c4a740 nodeName:}" failed. No retries permitted until 2025-10-02 11:34:28.957033358 +0000 UTC m=+1469.507399722 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data") pod "rabbitmq-cell1-server-0" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740") : configmap "rabbitmq-cell1-config-data" not found
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.977880 4929 generic.go:334] "Generic (PLEG): container finished" podID="7d52b938-d877-46ba-b19c-7e6331422d01" containerID="472cf392fc1d08934faa0f282d9bc772f4855616312d90f4abf1debe7b4a8890" exitCode=0
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.977999 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican9b10-account-delete-8lnrl" event={"ID":"7d52b938-d877-46ba-b19c-7e6331422d01","Type":"ContainerDied","Data":"472cf392fc1d08934faa0f282d9bc772f4855616312d90f4abf1debe7b4a8890"}
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.978031 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican9b10-account-delete-8lnrl" event={"ID":"7d52b938-d877-46ba-b19c-7e6331422d01","Type":"ContainerStarted","Data":"50d151c3b37761a225850281594de023375cdd962b8526947b548b539c353258"}
Oct 02 11:34:26 crc kubenswrapper[4929]: I1002 11:34:26.978547 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86f887d-db93-49c4-85ed-add5f01b25f7-kube-api-access-pq6cz" (OuterVolumeSpecName: "kube-api-access-pq6cz") pod "e86f887d-db93-49c4-85ed-add5f01b25f7" (UID: "e86f887d-db93-49c4-85ed-add5f01b25f7"). InnerVolumeSpecName "kube-api-access-pq6cz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.008630 4929 generic.go:334] "Generic (PLEG): container finished" podID="e7503492-8d47-4852-aca4-0bb661665127" containerID="2ca8828c67e79e41d4ae3bdf92de19a250e65e7977515288ad6954c923d4fa65" exitCode=0
Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.008859 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5c11-account-delete-cw9t6" event={"ID":"e7503492-8d47-4852-aca4-0bb661665127","Type":"ContainerDied","Data":"2ca8828c67e79e41d4ae3bdf92de19a250e65e7977515288ad6954c923d4fa65"}
Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.034147 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell091e2-account-delete-5d9qb" event={"ID":"45f95478-d16b-4ffb-9389-f68851cce4a6","Type":"ContainerStarted","Data":"93ff912760b169e4dd1793686766af4537a3c6326730aafbf893ef9dc1071707"}
Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.063976 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq6cz\" (UniqueName: \"kubernetes.io/projected/e86f887d-db93-49c4-85ed-add5f01b25f7-kube-api-access-pq6cz\") on node \"crc\" DevicePath \"\""
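The volume records above follow a fixed three-step teardown per volume: reconciler_common.go:159 logs "operationExecutor.UnmountVolume started", operation_generator.go:803 logs "UnmountVolume.TearDown succeeded", and reconciler_common.go:293 finally reports "Volume detached". A minimal sketch that correlates the three phases by the volume's UniqueName from a saved excerpt (the kubelet.log file name is an assumption):

```python
import re
from collections import defaultdict

# Correlate the teardown phases seen above by each volume's UniqueName:
#   started  -> reconciler_common.go:159 "operationExecutor.UnmountVolume started"
#   torndown -> operation_generator.go:803 "UnmountVolume.TearDown succeeded"
#   detached -> reconciler_common.go:293 "Volume detached"
# Reading from "kubelet.log" is an assumption for the sketch; the \\? makes
# the patterns tolerate the escaped quotes (\") in the journal text.
pats = {
    "started":  re.compile(r'UnmountVolume started for volume .*?UniqueName: \\?"([^"\\]+)'),
    "torndown": re.compile(r'TearDown succeeded for volume "([^"]+)"'),
    "detached": re.compile(r'Volume detached for volume .*?UniqueName: \\?"([^"\\]+)'),
}

phases = defaultdict(set)
with open("kubelet.log", encoding="utf-8") as fh:
    for line in fh:
        for phase, pat in pats.items():
            m = pat.search(line)
            if m:
                phases[m.group(1)].add(phase)

for vol, seen in sorted(phases.items()):
    missing = set(pats) - seen
    print(vol, "complete" if not missing else f"missing {sorted(missing)}")
```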
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.080394 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7b2b-account-delete-ghckr" event={"ID":"ae9c788d-5f22-443d-aa60-2f9e88dce9fd","Type":"ContainerStarted","Data":"dc7dcc0a747f65b16e6a03a780cc29f5adf362e38ab2c87519ed5de907c6ce29"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.080439 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7b2b-account-delete-ghckr" event={"ID":"ae9c788d-5f22-443d-aa60-2f9e88dce9fd","Type":"ContainerStarted","Data":"cc9620e8f068af3c2289866c70bf0a776ab9d669d30d45ca2cad2b9066c2db3c"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.090188 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1b9e6-account-delete-5mvft" event={"ID":"44d85c4b-9da3-40d5-a5c3-0aeac38eecee","Type":"ContainerStarted","Data":"526992c2364b4e9057ed6ed6417319db8e496bacabbda84ceb04da248da272e6"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.092578 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.099724 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea7fb-account-delete-pvxdd" event={"ID":"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f","Type":"ContainerStarted","Data":"2712b8756332b6295b655e11ab9f980ca701e1946c08758c38d253cc1e57c014"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.099796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea7fb-account-delete-pvxdd" event={"ID":"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f","Type":"ContainerStarted","Data":"6591c1e370e03752020624fe505bba47cd9184a25947b8ffb83603ef1ded1813"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.099947 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glancea7fb-account-delete-pvxdd" podUID="df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f" containerName="mariadb-account-delete" containerID="cri-o://2712b8756332b6295b655e11ab9f980ca701e1946c08758c38d253cc1e57c014" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.117878 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron7b2b-account-delete-ghckr" podStartSLOduration=5.117858387 podStartE2EDuration="5.117858387s" podCreationTimestamp="2025-10-02 11:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:34:27.113597074 +0000 UTC m=+1467.663963438" watchObservedRunningTime="2025-10-02 11:34:27.117858387 +0000 UTC m=+1467.668224751" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.122000 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69566f664c-jps5x" event={"ID":"23c56c4a-763f-4ce6-8b1f-d862662b16ec","Type":"ContainerDied","Data":"c62d0fecc8e0afb7f30bafad49e36c7e7ef2d820bca5f6117bf9998ba82d0c1c"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.122070 4929 scope.go:117] "RemoveContainer" containerID="027e91ba08dcef685cce7a361a75e6c896ae57660ea313b7e71bbff0e07f1279" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.122120 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-69566f664c-jps5x" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.142092 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6c15-account-delete-bdxsh" event={"ID":"a18e7ab5-8994-4a34-98d2-0e65bbfc4068","Type":"ContainerStarted","Data":"9b38f23d86986aac0b6e3cdfaf784dd0e3723c3341bf10d16f821513ffafacff"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.142149 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6c15-account-delete-bdxsh" event={"ID":"a18e7ab5-8994-4a34-98d2-0e65bbfc4068","Type":"ContainerStarted","Data":"e555f5f55cc8a52231fc20c4c688aea477348ee5d194614421ec7c8cc2f08a2c"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.142287 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi6c15-account-delete-bdxsh" podUID="a18e7ab5-8994-4a34-98d2-0e65bbfc4068" containerName="mariadb-account-delete" containerID="cri-o://9b38f23d86986aac0b6e3cdfaf784dd0e3723c3341bf10d16f821513ffafacff" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.143345 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e86f887d-db93-49c4-85ed-add5f01b25f7" (UID: "e86f887d-db93-49c4-85ed-add5f01b25f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.164133 4929 generic.go:334] "Generic (PLEG): container finished" podID="0ba6072c-759c-4261-8107-8243d262003d" containerID="212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383" exitCode=0 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.164206 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba6072c-759c-4261-8107-8243d262003d","Type":"ContainerDied","Data":"212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.164278 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.164885 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-scripts\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.164936 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39949247-a1b3-41bc-a94a-4c59049576cd-etc-machine-id\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165074 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-internal-tls-certs\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165137 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-combined-ca-bundle\") pod \"0ba6072c-759c-4261-8107-8243d262003d\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165144 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39949247-a1b3-41bc-a94a-4c59049576cd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165179 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kgm7\" (UniqueName: \"kubernetes.io/projected/39949247-a1b3-41bc-a94a-4c59049576cd-kube-api-access-8kgm7\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165242 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts6kf\" (UniqueName: \"kubernetes.io/projected/0ba6072c-759c-4261-8107-8243d262003d-kube-api-access-ts6kf\") pod \"0ba6072c-759c-4261-8107-8243d262003d\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165301 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-public-tls-certs\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165324 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-combined-ca-bundle\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165354 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39949247-a1b3-41bc-a94a-4c59049576cd-logs\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165424 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data-custom\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165445 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-config-data\") pod \"0ba6072c-759c-4261-8107-8243d262003d\" (UID: \"0ba6072c-759c-4261-8107-8243d262003d\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165479 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data\") pod \"39949247-a1b3-41bc-a94a-4c59049576cd\" (UID: \"39949247-a1b3-41bc-a94a-4c59049576cd\") " Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165930 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39949247-a1b3-41bc-a94a-4c59049576cd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.165949 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.174395 4929 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/glancea7fb-account-delete-pvxdd" podStartSLOduration=5.174370456 podStartE2EDuration="5.174370456s" podCreationTimestamp="2025-10-02 11:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:34:27.159922732 +0000 UTC m=+1467.710289096" watchObservedRunningTime="2025-10-02 11:34:27.174370456 +0000 UTC m=+1467.724736840" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.176506 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39949247-a1b3-41bc-a94a-4c59049576cd-kube-api-access-8kgm7" (OuterVolumeSpecName: "kube-api-access-8kgm7") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "kube-api-access-8kgm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.178036 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39949247-a1b3-41bc-a94a-4c59049576cd-logs" (OuterVolumeSpecName: "logs") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.182465 4929 generic.go:334] "Generic (PLEG): container finished" podID="e86f887d-db93-49c4-85ed-add5f01b25f7" containerID="20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557" exitCode=0 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.182589 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e86f887d-db93-49c4-85ed-add5f01b25f7","Type":"ContainerDied","Data":"20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.183099 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e86f887d-db93-49c4-85ed-add5f01b25f7","Type":"ContainerDied","Data":"3bf652b94e541d9dc200f08945d546a3a641fb201871293a8fa04fcc430f1df3"} Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.182669 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.193024 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-scripts" (OuterVolumeSpecName: "scripts") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.198924 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi6c15-account-delete-bdxsh" podStartSLOduration=4.198888559 podStartE2EDuration="4.198888559s" podCreationTimestamp="2025-10-02 11:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:34:27.188677196 +0000 UTC m=+1467.739043570" watchObservedRunningTime="2025-10-02 11:34:27.198888559 +0000 UTC m=+1467.749254923" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.203497 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba6072c-759c-4261-8107-8243d262003d-kube-api-access-ts6kf" (OuterVolumeSpecName: "kube-api-access-ts6kf") pod "0ba6072c-759c-4261-8107-8243d262003d" (UID: "0ba6072c-759c-4261-8107-8243d262003d"). InnerVolumeSpecName "kube-api-access-ts6kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.212691 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.222091 4929 scope.go:117] "RemoveContainer" containerID="a9694eb8911f9ff8551424e463b8bef808eae32090d923b999eba6998b0901c3" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.268251 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kgm7\" (UniqueName: \"kubernetes.io/projected/39949247-a1b3-41bc-a94a-4c59049576cd-kube-api-access-8kgm7\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.268283 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts6kf\" (UniqueName: \"kubernetes.io/projected/0ba6072c-759c-4261-8107-8243d262003d-kube-api-access-ts6kf\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.268295 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39949247-a1b3-41bc-a94a-4c59049576cd-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.268307 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.268318 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.269569 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-config-data" (OuterVolumeSpecName: "config-data") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.281361 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "e86f887d-db93-49c4-85ed-add5f01b25f7" (UID: "e86f887d-db93-49c4-85ed-add5f01b25f7"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.302961 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-config-data" (OuterVolumeSpecName: "config-data") pod "e86f887d-db93-49c4-85ed-add5f01b25f7" (UID: "e86f887d-db93-49c4-85ed-add5f01b25f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.322525 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.333545 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.380859 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-config-data" (OuterVolumeSpecName: "config-data") pod "0ba6072c-759c-4261-8107-8243d262003d" (UID: "0ba6072c-759c-4261-8107-8243d262003d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.382129 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.382153 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.382164 4929 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.382175 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.382183 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.382191 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.388848 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23c56c4a-763f-4ce6-8b1f-d862662b16ec" (UID: "23c56c4a-763f-4ce6-8b1f-d862662b16ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.397637 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerName="rabbitmq" containerID="cri-o://1300e80581b8037301a49cd07f0c5f8de41330fcc719f6803e48273136aa7404" gracePeriod=604800 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.479466 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.479789 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-central-agent" containerID="cri-o://83423c2654ca49651439a144e3eff0c4e3371ed929b89d62422a53c0a6be0dea" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.480308 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="proxy-httpd" containerID="cri-o://e5d8d0d67dce0c56c6f683c231f1e918df07bdc899e3c55e672d1e0c06c2472e" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.480374 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="sg-core" containerID="cri-o://c8b786a68a0810d547f648303a29ccea6d4efcfa31e794fa4cf6a27a57b61127" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.480429 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-notification-agent" containerID="cri-o://26cde39221ac8a2072a3fb8c38cbfe2e085b51f160c11eafe83d008ddf719bf8" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.480425 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ba6072c-759c-4261-8107-8243d262003d" (UID: "0ba6072c-759c-4261-8107-8243d262003d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.484293 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba6072c-759c-4261-8107-8243d262003d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.484379 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23c56c4a-763f-4ce6-8b1f-d862662b16ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.485202 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "e86f887d-db93-49c4-85ed-add5f01b25f7" (UID: "e86f887d-db93-49c4-85ed-add5f01b25f7"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.521149 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.532504 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.532703 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="51f77cee-69d5-4e5c-8707-a5be1914e351" containerName="kube-state-metrics" containerID="cri-o://2cae7ce5a315b3a1de86b3016d5d89f4784fd9aa5ec045ee66d7c930f75073c6" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.533425 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.588065 4929 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e86f887d-db93-49c4-85ed-add5f01b25f7-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.588095 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.588105 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.641987 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.675845 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.676350 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="56c5fe9e-033d-4c3b-a71f-e2c215add4c5" containerName="memcached" containerID="cri-o://4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.690433 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.703234 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2vplt"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.720101 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2vplt"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.789743 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b2b-account-create-pjqbx"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.798312 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b2b-account-create-pjqbx"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.814194 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron7b2b-account-delete-ghckr"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.832108 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kwgvk"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.838270 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data" (OuterVolumeSpecName: "config-data") pod "39949247-a1b3-41bc-a94a-4c59049576cd" (UID: "39949247-a1b3-41bc-a94a-4c59049576cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.842551 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kwgvk"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.848876 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6hhg9"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.875256 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6hhg9"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.899095 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39949247-a1b3-41bc-a94a-4c59049576cd-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.900867 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7bd786b699-2sf9r"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.901422 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7bd786b699-2sf9r" podUID="c89c2414-cee5-46e9-9284-cd96fb472fd7" containerName="keystone-api" containerID="cri-o://5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3" gracePeriod=30 Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.927852 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.954851 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-5frdp"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.963167 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d58dd68b-qlcrz" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:35248->10.217.0.160:9311: read: connection reset by peer" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.963489 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d58dd68b-qlcrz" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:35256->10.217.0.160:9311: read: connection reset by peer" Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.968300 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-cda6-account-create-r9q8t"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.983987 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-5frdp"] Oct 02 11:34:27 crc kubenswrapper[4929]: I1002 11:34:27.986812 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-cda6-account-create-r9q8t"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.125533 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="978200e0-025d-4000-baed-4ba85bf83c60" containerName="galera" containerID="cri-o://feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d" gracePeriod=30 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.187169 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40dec924-bb71-4ac6-ac46-d77725459692" path="/var/lib/kubelet/pods/40dec924-bb71-4ac6-ac46-d77725459692/volumes" Oct 02 11:34:28 crc kubenswrapper[4929]: 
I1002 11:34:28.190282 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55506978-e694-48ad-be63-ad2179c36f0f" path="/var/lib/kubelet/pods/55506978-e694-48ad-be63-ad2179c36f0f/volumes" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.192477 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44dff1a-4b7b-4bf0-aaab-26ed02bb1605" path="/var/lib/kubelet/pods/c44dff1a-4b7b-4bf0-aaab-26ed02bb1605/volumes" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.195473 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e10e697f-2b74-44fd-9645-f996885417e5" path="/var/lib/kubelet/pods/e10e697f-2b74-44fd-9645-f996885417e5/volumes" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.196393 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7812df5-741a-4a01-b77d-6b80b4e71090" path="/var/lib/kubelet/pods/e7812df5-741a-4a01-b77d-6b80b4e71090/volumes" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.197482 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31e4bd6-d3e3-452e-ba3b-40892b9866e3" path="/var/lib/kubelet/pods/f31e4bd6-d3e3-452e-ba3b-40892b9866e3/volumes" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.215372 4929 generic.go:334] "Generic (PLEG): container finished" podID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerID="157276ef9d3a545d2f5ce4288c1bab5d100b24eab57de2c4e08c2b13bd82b387" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.215451 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99d9d588b-ddwr8" event={"ID":"4b67fd7d-2814-4efd-ad06-ee8283104d49","Type":"ContainerDied","Data":"157276ef9d3a545d2f5ce4288c1bab5d100b24eab57de2c4e08c2b13bd82b387"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.215479 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99d9d588b-ddwr8" event={"ID":"4b67fd7d-2814-4efd-ad06-ee8283104d49","Type":"ContainerDied","Data":"e38f439d0190cd896bc49946a7c6eeb1bc9d00e6dc845a7d0d81aab6ba4a60de"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.215493 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38f439d0190cd896bc49946a7c6eeb1bc9d00e6dc845a7d0d81aab6ba4a60de" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.220192 4929 generic.go:334] "Generic (PLEG): container finished" podID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerID="41c95ee7304d226a10f65f9f4eb7f25c911fc6a7c2f8bd69eab49e70a8a3e99a" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.220271 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a15b3dd7-69b2-480e-b61d-bba396447b88","Type":"ContainerDied","Data":"41c95ee7304d226a10f65f9f4eb7f25c911fc6a7c2f8bd69eab49e70a8a3e99a"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.229856 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican9b10-account-delete-8lnrl" event={"ID":"7d52b938-d877-46ba-b19c-7e6331422d01","Type":"ContainerDied","Data":"50d151c3b37761a225850281594de023375cdd962b8526947b548b539c353258"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.229900 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d151c3b37761a225850281594de023375cdd962b8526947b548b539c353258" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.232182 4929 generic.go:334] "Generic (PLEG): container finished" 
podID="51f77cee-69d5-4e5c-8707-a5be1914e351" containerID="2cae7ce5a315b3a1de86b3016d5d89f4784fd9aa5ec045ee66d7c930f75073c6" exitCode=2 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.232256 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51f77cee-69d5-4e5c-8707-a5be1914e351","Type":"ContainerDied","Data":"2cae7ce5a315b3a1de86b3016d5d89f4784fd9aa5ec045ee66d7c930f75073c6"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.254210 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba6072c-759c-4261-8107-8243d262003d","Type":"ContainerDied","Data":"e5ae839ebb89fab5d23c50ba7108f6172c2b628a0a54fcf3daf0de30f391e438"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.259093 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"39949247-a1b3-41bc-a94a-4c59049576cd","Type":"ContainerDied","Data":"67256d7f709b19a7c49467b7589a840b7969dd0bd524771819f830758a4e416d"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.259176 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.273622 4929 generic.go:334] "Generic (PLEG): container finished" podID="b07c8ee2-5443-410c-b2ab-b48699694626" containerID="2b859fd219d68a03c833c80a4486933cb925eae152050cfa50df66277e417160" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.273771 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b07c8ee2-5443-410c-b2ab-b48699694626","Type":"ContainerDied","Data":"2b859fd219d68a03c833c80a4486933cb925eae152050cfa50df66277e417160"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.301645 4929 generic.go:334] "Generic (PLEG): container finished" podID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerID="b99a17b7ec88652946955b5fdf985f5b9d3d8bd15ef24dfadcf98117eac94d02" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.301745 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5faf6a4-6d67-4104-817f-422bdde6bf30","Type":"ContainerDied","Data":"b99a17b7ec88652946955b5fdf985f5b9d3d8bd15ef24dfadcf98117eac94d02"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.323184 4929 generic.go:334] "Generic (PLEG): container finished" podID="44d85c4b-9da3-40d5-a5c3-0aeac38eecee" containerID="759ab44b95a333eb21ce2ed2b607f3a10c0432cd2c8a3f26ea1c58203462e5b9" exitCode=1 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.323263 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1b9e6-account-delete-5mvft" event={"ID":"44d85c4b-9da3-40d5-a5c3-0aeac38eecee","Type":"ContainerDied","Data":"759ab44b95a333eb21ce2ed2b607f3a10c0432cd2c8a3f26ea1c58203462e5b9"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.333306 4929 generic.go:334] "Generic (PLEG): container finished" podID="a18e7ab5-8994-4a34-98d2-0e65bbfc4068" containerID="9b38f23d86986aac0b6e3cdfaf784dd0e3723c3341bf10d16f821513ffafacff" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.333444 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6c15-account-delete-bdxsh" event={"ID":"a18e7ab5-8994-4a34-98d2-0e65bbfc4068","Type":"ContainerDied","Data":"9b38f23d86986aac0b6e3cdfaf784dd0e3723c3341bf10d16f821513ffafacff"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.333480 4929 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi6c15-account-delete-bdxsh" event={"ID":"a18e7ab5-8994-4a34-98d2-0e65bbfc4068","Type":"ContainerDied","Data":"e555f5f55cc8a52231fc20c4c688aea477348ee5d194614421ec7c8cc2f08a2c"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.333520 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e555f5f55cc8a52231fc20c4c688aea477348ee5d194614421ec7c8cc2f08a2c" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.355702 4929 generic.go:334] "Generic (PLEG): container finished" podID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerID="e5d8d0d67dce0c56c6f683c231f1e918df07bdc899e3c55e672d1e0c06c2472e" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.355728 4929 generic.go:334] "Generic (PLEG): container finished" podID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerID="c8b786a68a0810d547f648303a29ccea6d4efcfa31e794fa4cf6a27a57b61127" exitCode=2 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.355767 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerDied","Data":"e5d8d0d67dce0c56c6f683c231f1e918df07bdc899e3c55e672d1e0c06c2472e"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.355787 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerDied","Data":"c8b786a68a0810d547f648303a29ccea6d4efcfa31e794fa4cf6a27a57b61127"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.380999 4929 generic.go:334] "Generic (PLEG): container finished" podID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerID="d65560d220b0508cdd18383acef57b7ee3e6f336bff1911b49d9df550f30b608" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.381120 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8ebda2a-aee6-4eed-8333-5e96219fdcb3","Type":"ContainerDied","Data":"d65560d220b0508cdd18383acef57b7ee3e6f336bff1911b49d9df550f30b608"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.384780 4929 generic.go:334] "Generic (PLEG): container finished" podID="45f95478-d16b-4ffb-9389-f68851cce4a6" containerID="263c51355f740ae621964dc8410f6176b2dc8b9d938a5de67eddf2dd56884ff0" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.384866 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell091e2-account-delete-5d9qb" event={"ID":"45f95478-d16b-4ffb-9389-f68851cce4a6","Type":"ContainerDied","Data":"263c51355f740ae621964dc8410f6176b2dc8b9d938a5de67eddf2dd56884ff0"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.390878 4929 generic.go:334] "Generic (PLEG): container finished" podID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerID="e53b3ad5b5ce14f2519bfc0e1a58672bd568af0b97109869b7485930f064cca6" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.390961 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d58dd68b-qlcrz" event={"ID":"df0101ab-4fa3-4475-a685-fdd9ebb0ef68","Type":"ContainerDied","Data":"e53b3ad5b5ce14f2519bfc0e1a58672bd568af0b97109869b7485930f064cca6"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.414634 4929 generic.go:334] "Generic (PLEG): container finished" podID="df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f" containerID="2712b8756332b6295b655e11ab9f980ca701e1946c08758c38d253cc1e57c014" exitCode=0 Oct 02 11:34:28 
crc kubenswrapper[4929]: I1002 11:34:28.414706 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea7fb-account-delete-pvxdd" event={"ID":"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f","Type":"ContainerDied","Data":"2712b8756332b6295b655e11ab9f980ca701e1946c08758c38d253cc1e57c014"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.414726 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancea7fb-account-delete-pvxdd" event={"ID":"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f","Type":"ContainerDied","Data":"6591c1e370e03752020624fe505bba47cd9184a25947b8ffb83603ef1ded1813"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.414736 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6591c1e370e03752020624fe505bba47cd9184a25947b8ffb83603ef1ded1813" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.447310 4929 generic.go:334] "Generic (PLEG): container finished" podID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerID="e0f273d8b045c5750b84ae82eb3de630e392ed10f8ecf765920cb992fe5bf07b" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.447380 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2","Type":"ContainerDied","Data":"e0f273d8b045c5750b84ae82eb3de630e392ed10f8ecf765920cb992fe5bf07b"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.449697 4929 generic.go:334] "Generic (PLEG): container finished" podID="ae9c788d-5f22-443d-aa60-2f9e88dce9fd" containerID="dc7dcc0a747f65b16e6a03a780cc29f5adf362e38ab2c87519ed5de907c6ce29" exitCode=0 Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.450640 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7b2b-account-delete-ghckr" event={"ID":"ae9c788d-5f22-443d-aa60-2f9e88dce9fd","Type":"ContainerDied","Data":"dc7dcc0a747f65b16e6a03a780cc29f5adf362e38ab2c87519ed5de907c6ce29"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.460017 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement5c11-account-delete-cw9t6" event={"ID":"e7503492-8d47-4852-aca4-0bb661665127","Type":"ContainerDied","Data":"6492ca5a8cfa08ed0e452342878ec82ca80db26bbbc63b72d6819758570fa397"} Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.460159 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6492ca5a8cfa08ed0e452342878ec82ca80db26bbbc63b72d6819758570fa397" Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.633636 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.648461 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.648540 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.648680 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement5c11-account-delete-cw9t6" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.649420 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.658926 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.659026 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.686061 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.692114 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.692334 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.704198 4929 scope.go:117] "RemoveContainer" containerID="212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.704548 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican9b10-account-delete-8lnrl" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.725190 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.742159 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancea7fb-account-delete-pvxdd" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.747013 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.748005 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.748056 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.749058 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6c15-account-delete-bdxsh" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.764649 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6n67\" (UniqueName: \"kubernetes.io/projected/e7503492-8d47-4852-aca4-0bb661665127-kube-api-access-w6n67\") pod \"e7503492-8d47-4852-aca4-0bb661665127\" (UID: \"e7503492-8d47-4852-aca4-0bb661665127\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.772499 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.779276 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.786461 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7503492-8d47-4852-aca4-0bb661665127-kube-api-access-w6n67" (OuterVolumeSpecName: "kube-api-access-w6n67") pod "e7503492-8d47-4852-aca4-0bb661665127" (UID: "e7503492-8d47-4852-aca4-0bb661665127"). InnerVolumeSpecName "kube-api-access-w6n67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.789017 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-69566f664c-jps5x"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.800878 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8kqgz" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" probeResult="failure" output="command timed out" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.807924 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-69566f664c-jps5x"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.823875 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.830636 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.836313 4929 scope.go:117] "RemoveContainer" containerID="20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.865817 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-combined-ca-bundle\") pod \"4b67fd7d-2814-4efd-ad06-ee8283104d49\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.865891 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-internal-tls-certs\") pod \"4b67fd7d-2814-4efd-ad06-ee8283104d49\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.865929 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9qsr\" (UniqueName: \"kubernetes.io/projected/a18e7ab5-8994-4a34-98d2-0e65bbfc4068-kube-api-access-m9qsr\") pod \"a18e7ab5-8994-4a34-98d2-0e65bbfc4068\" (UID: \"a18e7ab5-8994-4a34-98d2-0e65bbfc4068\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.866948 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2kmp\" (UniqueName: \"kubernetes.io/projected/4b67fd7d-2814-4efd-ad06-ee8283104d49-kube-api-access-d2kmp\") pod \"4b67fd7d-2814-4efd-ad06-ee8283104d49\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868790 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-logs\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868830 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-public-tls-certs\") pod \"4b67fd7d-2814-4efd-ad06-ee8283104d49\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868854 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data\") pod \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868875 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-etc-machine-id\") pod \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868902 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-scripts\") pod \"4b67fd7d-2814-4efd-ad06-ee8283104d49\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868930 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-config-data\") pod \"4b67fd7d-2814-4efd-ad06-ee8283104d49\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868953 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7bzd\" (UniqueName: \"kubernetes.io/projected/df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f-kube-api-access-h7bzd\") pod \"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f\" (UID: \"df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.868996 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-scripts\") pod \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.869027 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghbgc\" (UniqueName: \"kubernetes.io/projected/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-kube-api-access-ghbgc\") pod \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.869041 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-combined-ca-bundle\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.869063 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b67fd7d-2814-4efd-ad06-ee8283104d49-logs\") pod \"4b67fd7d-2814-4efd-ad06-ee8283104d49\" (UID: \"4b67fd7d-2814-4efd-ad06-ee8283104d49\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.869082 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tts4z\" (UniqueName: \"kubernetes.io/projected/7d52b938-d877-46ba-b19c-7e6331422d01-kube-api-access-tts4z\") pod \"7d52b938-d877-46ba-b19c-7e6331422d01\" (UID: \"7d52b938-d877-46ba-b19c-7e6331422d01\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.869451 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6n67\" (UniqueName: 
\"kubernetes.io/projected/e7503492-8d47-4852-aca4-0bb661665127-kube-api-access-w6n67\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.871939 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18e7ab5-8994-4a34-98d2-0e65bbfc4068-kube-api-access-m9qsr" (OuterVolumeSpecName: "kube-api-access-m9qsr") pod "a18e7ab5-8994-4a34-98d2-0e65bbfc4068" (UID: "a18e7ab5-8994-4a34-98d2-0e65bbfc4068"). InnerVolumeSpecName "kube-api-access-m9qsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.872279 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b67fd7d-2814-4efd-ad06-ee8283104d49-logs" (OuterVolumeSpecName: "logs") pod "4b67fd7d-2814-4efd-ad06-ee8283104d49" (UID: "4b67fd7d-2814-4efd-ad06-ee8283104d49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.874696 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f-kube-api-access-h7bzd" (OuterVolumeSpecName: "kube-api-access-h7bzd") pod "df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f" (UID: "df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f"). InnerVolumeSpecName "kube-api-access-h7bzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.875996 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d52b938-d877-46ba-b19c-7e6331422d01-kube-api-access-tts4z" (OuterVolumeSpecName: "kube-api-access-tts4z") pod "7d52b938-d877-46ba-b19c-7e6331422d01" (UID: "7d52b938-d877-46ba-b19c-7e6331422d01"). InnerVolumeSpecName "kube-api-access-tts4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.876041 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" (UID: "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.876070 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-logs" (OuterVolumeSpecName: "logs") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.883919 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-scripts" (OuterVolumeSpecName: "scripts") pod "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" (UID: "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.887459 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-scripts" (OuterVolumeSpecName: "scripts") pod "4b67fd7d-2814-4efd-ad06-ee8283104d49" (UID: "4b67fd7d-2814-4efd-ad06-ee8283104d49"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.889182 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b67fd7d-2814-4efd-ad06-ee8283104d49-kube-api-access-d2kmp" (OuterVolumeSpecName: "kube-api-access-d2kmp") pod "4b67fd7d-2814-4efd-ad06-ee8283104d49" (UID: "4b67fd7d-2814-4efd-ad06-ee8283104d49"). InnerVolumeSpecName "kube-api-access-d2kmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.892599 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-kube-api-access-ghbgc" (OuterVolumeSpecName: "kube-api-access-ghbgc") pod "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" (UID: "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2"). InnerVolumeSpecName "kube-api-access-ghbgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.913036 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8kqgz" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" probeResult="failure" output=< Oct 02 11:34:28 crc kubenswrapper[4929]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Oct 02 11:34:28 crc kubenswrapper[4929]: > Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970175 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-httpd-run\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970211 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-combined-ca-bundle\") pod \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970243 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970422 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g99h\" (UniqueName: \"kubernetes.io/projected/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-kube-api-access-8g99h\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970448 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data-custom\") pod \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\" (UID: \"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970461 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-scripts\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 
11:34:28.970490 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-internal-tls-certs\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970516 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-config-data\") pod \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\" (UID: \"c8ebda2a-aee6-4eed-8333-5e96219fdcb3\") " Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970854 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970865 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7bzd\" (UniqueName: \"kubernetes.io/projected/df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f-kube-api-access-h7bzd\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970874 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970881 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghbgc\" (UniqueName: \"kubernetes.io/projected/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-kube-api-access-ghbgc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970889 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b67fd7d-2814-4efd-ad06-ee8283104d49-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970898 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tts4z\" (UniqueName: \"kubernetes.io/projected/7d52b938-d877-46ba-b19c-7e6331422d01-kube-api-access-tts4z\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970906 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9qsr\" (UniqueName: \"kubernetes.io/projected/a18e7ab5-8994-4a34-98d2-0e65bbfc4068-kube-api-access-m9qsr\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970914 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2kmp\" (UniqueName: \"kubernetes.io/projected/4b67fd7d-2814-4efd-ad06-ee8283104d49-kube-api-access-d2kmp\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970922 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.970929 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:28 crc kubenswrapper[4929]: E1002 11:34:28.971043 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:28 crc 
kubenswrapper[4929]: E1002 11:34:28.971093 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data podName:dfb673e7-59bc-41d1-9bf0-d20527c4a740 nodeName:}" failed. No retries permitted until 2025-10-02 11:34:32.971075356 +0000 UTC m=+1473.521441720 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data") pod "rabbitmq-cell1-server-0" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.971679 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.975010 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" (UID: "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:28 crc kubenswrapper[4929]: I1002 11:34:28.976186 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-scripts" (OuterVolumeSpecName: "scripts") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.029195 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.053689 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-kube-api-access-8g99h" (OuterVolumeSpecName: "kube-api-access-8g99h") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "kube-api-access-8g99h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.073873 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.073932 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.073946 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g99h\" (UniqueName: \"kubernetes.io/projected/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-kube-api-access-8g99h\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.073972 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.073983 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.081475 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.156246 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.159049 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.175297 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.184358 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.188135 4929 scope.go:117] "RemoveContainer" containerID="20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.191062 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 02 11:34:29 crc kubenswrapper[4929]: E1002 11:34:29.193938 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557\": container with ID starting with 20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557 not found: ID does not exist" containerID="20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.194028 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557"} err="failed to get container status \"20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557\": rpc error: code = NotFound desc = could not find container \"20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557\": container with ID starting with 20d8e3702ec8200d0b30dfac3be2aac812b033b148eaa3e20dea33b252433557 not found: ID does not exist" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.194067 4929 scope.go:117] "RemoveContainer" containerID="212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383" Oct 02 11:34:29 crc kubenswrapper[4929]: E1002 11:34:29.194448 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383\": container with ID starting with 212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383 not found: ID does not exist" containerID="212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.194484 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383"} err="failed to get container status \"212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383\": rpc error: code = NotFound desc = could not find container \"212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383\": container with ID starting with 212024a4805c969b49a9d7ea665b0079e6d9ff1777564b5519cdac764c6a7383 not found: ID does not exist" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.194512 4929 scope.go:117] "RemoveContainer" containerID="1369b86d88547970a8f877da10a92d101eafd5fa601f783fd300d42fabfef237" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.201018 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" (UID: "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.205474 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data" (OuterVolumeSpecName: "config-data") pod "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" (UID: "ace60114-0dd0-4f94-aad6-b1c2ace2c9d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.224818 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-config-data" (OuterVolumeSpecName: "config-data") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.233112 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.242004 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280592 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-scripts\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280712 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m88f6\" (UniqueName: \"kubernetes.io/projected/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-kube-api-access-m88f6\") pod \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280742 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-public-tls-certs\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280776 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data-custom\") pod \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280799 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-public-tls-certs\") pod \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280856 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-logs\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280886 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-combined-ca-bundle\") pod \"51f77cee-69d5-4e5c-8707-a5be1914e351\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280928 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-combined-ca-bundle\") pod \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.280978 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr5sm\" (UniqueName: \"kubernetes.io/projected/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-api-access-jr5sm\") pod \"51f77cee-69d5-4e5c-8707-a5be1914e351\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281006 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281030 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-combined-ca-bundle\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281060 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-config-data\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281084 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fslgk\" (UniqueName: \"kubernetes.io/projected/a15b3dd7-69b2-480e-b61d-bba396447b88-kube-api-access-fslgk\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281271 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-internal-tls-certs\") pod \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281325 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-config\") pod \"51f77cee-69d5-4e5c-8707-a5be1914e351\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281371 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-logs\") pod \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281402 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-certs\") pod \"51f77cee-69d5-4e5c-8707-a5be1914e351\" (UID: \"51f77cee-69d5-4e5c-8707-a5be1914e351\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281432 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data\") pod \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\" (UID: \"df0101ab-4fa3-4475-a685-fdd9ebb0ef68\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281505 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-httpd-run\") pod \"a15b3dd7-69b2-480e-b61d-bba396447b88\" (UID: \"a15b3dd7-69b2-480e-b61d-bba396447b88\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281937 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281978 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.281993 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.282484 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.286320 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-logs" (OuterVolumeSpecName: "logs") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.290381 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-logs" (OuterVolumeSpecName: "logs") pod "df0101ab-4fa3-4475-a685-fdd9ebb0ef68" (UID: "df0101ab-4fa3-4475-a685-fdd9ebb0ef68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.294398 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-api-access-jr5sm" (OuterVolumeSpecName: "kube-api-access-jr5sm") pod "51f77cee-69d5-4e5c-8707-a5be1914e351" (UID: "51f77cee-69d5-4e5c-8707-a5be1914e351"). InnerVolumeSpecName "kube-api-access-jr5sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.296225 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-scripts" (OuterVolumeSpecName: "scripts") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.311879 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-kube-api-access-m88f6" (OuterVolumeSpecName: "kube-api-access-m88f6") pod "df0101ab-4fa3-4475-a685-fdd9ebb0ef68" (UID: "df0101ab-4fa3-4475-a685-fdd9ebb0ef68"). InnerVolumeSpecName "kube-api-access-m88f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.311896 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df0101ab-4fa3-4475-a685-fdd9ebb0ef68" (UID: "df0101ab-4fa3-4475-a685-fdd9ebb0ef68"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.322354 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15b3dd7-69b2-480e-b61d-bba396447b88-kube-api-access-fslgk" (OuterVolumeSpecName: "kube-api-access-fslgk") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). InnerVolumeSpecName "kube-api-access-fslgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.322447 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.324714 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1b9e6-account-delete-5mvft" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.335101 4929 scope.go:117] "RemoveContainer" containerID="00567a0bc33f8557e178e9e93912b92e11c4d3ee3b160960eaea914e74d1fdf9" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.347763 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.373787 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.374793 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.380067 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell091e2-account-delete-5d9qb" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.382677 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-combined-ca-bundle\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.382757 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-generated\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.382875 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-kolla-config\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.382926 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-default\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383005 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-nova-metadata-tls-certs\") pod \"f5faf6a4-6d67-4104-817f-422bdde6bf30\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383059 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-galera-tls-certs\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383090 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-combined-ca-bundle\") pod \"f5faf6a4-6d67-4104-817f-422bdde6bf30\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383221 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-secrets\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383250 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmq7n\" (UniqueName: \"kubernetes.io/projected/b07c8ee2-5443-410c-b2ab-b48699694626-kube-api-access-vmq7n\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383281 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-config-data\") 
pod \"f5faf6a4-6d67-4104-817f-422bdde6bf30\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383304 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpsnz\" (UniqueName: \"kubernetes.io/projected/f5faf6a4-6d67-4104-817f-422bdde6bf30-kube-api-access-bpsnz\") pod \"f5faf6a4-6d67-4104-817f-422bdde6bf30\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383334 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5faf6a4-6d67-4104-817f-422bdde6bf30-logs\") pod \"f5faf6a4-6d67-4104-817f-422bdde6bf30\" (UID: \"f5faf6a4-6d67-4104-817f-422bdde6bf30\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383370 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383413 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-operator-scripts\") pod \"b07c8ee2-5443-410c-b2ab-b48699694626\" (UID: \"b07c8ee2-5443-410c-b2ab-b48699694626\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383856 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383878 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr5sm\" (UniqueName: \"kubernetes.io/projected/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-api-access-jr5sm\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383903 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383915 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fslgk\" (UniqueName: \"kubernetes.io/projected/a15b3dd7-69b2-480e-b61d-bba396447b88-kube-api-access-fslgk\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383929 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383939 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.383951 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a15b3dd7-69b2-480e-b61d-bba396447b88-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.384153 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 
11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.384169 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m88f6\" (UniqueName: \"kubernetes.io/projected/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-kube-api-access-m88f6\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.384182 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.389615 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.390093 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.391749 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.393018 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.393774 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5faf6a4-6d67-4104-817f-422bdde6bf30-logs" (OuterVolumeSpecName: "logs") pod "f5faf6a4-6d67-4104-817f-422bdde6bf30" (UID: "f5faf6a4-6d67-4104-817f-422bdde6bf30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.395306 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-config-data" (OuterVolumeSpecName: "config-data") pod "4b67fd7d-2814-4efd-ad06-ee8283104d49" (UID: "4b67fd7d-2814-4efd-ad06-ee8283104d49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.400022 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.410972 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5faf6a4-6d67-4104-817f-422bdde6bf30-kube-api-access-bpsnz" (OuterVolumeSpecName: "kube-api-access-bpsnz") pod "f5faf6a4-6d67-4104-817f-422bdde6bf30" (UID: "f5faf6a4-6d67-4104-817f-422bdde6bf30"). InnerVolumeSpecName "kube-api-access-bpsnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.411257 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-secrets" (OuterVolumeSpecName: "secrets") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.416285 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b07c8ee2-5443-410c-b2ab-b48699694626-kube-api-access-vmq7n" (OuterVolumeSpecName: "kube-api-access-vmq7n") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "kube-api-access-vmq7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.420009 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c8ebda2a-aee6-4eed-8333-5e96219fdcb3" (UID: "c8ebda2a-aee6-4eed-8333-5e96219fdcb3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.468412 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.469312 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b67fd7d-2814-4efd-ad06-ee8283104d49" (UID: "4b67fd7d-2814-4efd-ad06-ee8283104d49"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.477110 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488297 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-config-data\") pod \"2fafc589-0041-44b2-a66b-93f4676c3cb1\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488356 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6n44\" (UniqueName: \"kubernetes.io/projected/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kube-api-access-h6n44\") pod \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488396 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcx7v\" (UniqueName: \"kubernetes.io/projected/44d85c4b-9da3-40d5-a5c3-0aeac38eecee-kube-api-access-qcx7v\") pod \"44d85c4b-9da3-40d5-a5c3-0aeac38eecee\" (UID: \"44d85c4b-9da3-40d5-a5c3-0aeac38eecee\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488437 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-public-tls-certs\") pod \"2fafc589-0041-44b2-a66b-93f4676c3cb1\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488454 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlgvp\" (UniqueName: \"kubernetes.io/projected/45f95478-d16b-4ffb-9389-f68851cce4a6-kube-api-access-hlgvp\") pod \"45f95478-d16b-4ffb-9389-f68851cce4a6\" (UID: \"45f95478-d16b-4ffb-9389-f68851cce4a6\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488475 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-combined-ca-bundle\") pod \"2fafc589-0041-44b2-a66b-93f4676c3cb1\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488507 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-memcached-tls-certs\") pod \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488538 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kolla-config\") pod \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488559 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-config-data\") pod \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488667 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-combined-ca-bundle\") pod 
\"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\" (UID: \"56c5fe9e-033d-4c3b-a71f-e2c215add4c5\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488696 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-internal-tls-certs\") pod \"2fafc589-0041-44b2-a66b-93f4676c3cb1\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488733 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr8th\" (UniqueName: \"kubernetes.io/projected/2fafc589-0041-44b2-a66b-93f4676c3cb1-kube-api-access-kr8th\") pod \"2fafc589-0041-44b2-a66b-93f4676c3cb1\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.488766 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fafc589-0041-44b2-a66b-93f4676c3cb1-logs\") pod \"2fafc589-0041-44b2-a66b-93f4676c3cb1\" (UID: \"2fafc589-0041-44b2-a66b-93f4676c3cb1\") " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489099 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489114 4929 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489124 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmq7n\" (UniqueName: \"kubernetes.io/projected/b07c8ee2-5443-410c-b2ab-b48699694626-kube-api-access-vmq7n\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489133 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpsnz\" (UniqueName: \"kubernetes.io/projected/f5faf6a4-6d67-4104-817f-422bdde6bf30-kube-api-access-bpsnz\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489143 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5faf6a4-6d67-4104-817f-422bdde6bf30-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489159 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489167 4929 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489176 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489184 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 
11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489192 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489202 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ebda2a-aee6-4eed-8333-5e96219fdcb3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489210 4929 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489217 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b07c8ee2-5443-410c-b2ab-b48699694626-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.489225 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.500578 4929 generic.go:334] "Generic (PLEG): container finished" podID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerID="ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1" exitCode=0 Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.500662 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fafc589-0041-44b2-a66b-93f4676c3cb1","Type":"ContainerDied","Data":"ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.500725 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2fafc589-0041-44b2-a66b-93f4676c3cb1","Type":"ContainerDied","Data":"2b01a875553d4fc10378af9772d0e06b0e374db11e481f765d528a5b5b557c83"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.500744 4929 scope.go:117] "RemoveContainer" containerID="ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.501023 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.509090 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fafc589-0041-44b2-a66b-93f4676c3cb1-logs" (OuterVolumeSpecName: "logs") pod "2fafc589-0041-44b2-a66b-93f4676c3cb1" (UID: "2fafc589-0041-44b2-a66b-93f4676c3cb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.513682 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "56c5fe9e-033d-4c3b-a71f-e2c215add4c5" (UID: "56c5fe9e-033d-4c3b-a71f-e2c215add4c5"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.514332 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-config-data" (OuterVolumeSpecName: "config-data") pod "56c5fe9e-033d-4c3b-a71f-e2c215add4c5" (UID: "56c5fe9e-033d-4c3b-a71f-e2c215add4c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.518414 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f95478-d16b-4ffb-9389-f68851cce4a6-kube-api-access-hlgvp" (OuterVolumeSpecName: "kube-api-access-hlgvp") pod "45f95478-d16b-4ffb-9389-f68851cce4a6" (UID: "45f95478-d16b-4ffb-9389-f68851cce4a6"). InnerVolumeSpecName "kube-api-access-hlgvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.521736 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.522661 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8ebda2a-aee6-4eed-8333-5e96219fdcb3","Type":"ContainerDied","Data":"9a173bc4afd6cc5b5d7e163fdc9b0e65bd13273638d86d4dad5442dfad5d3304"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.529308 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51f77cee-69d5-4e5c-8707-a5be1914e351" (UID: "51f77cee-69d5-4e5c-8707-a5be1914e351"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.530418 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fafc589-0041-44b2-a66b-93f4676c3cb1-kube-api-access-kr8th" (OuterVolumeSpecName: "kube-api-access-kr8th") pod "2fafc589-0041-44b2-a66b-93f4676c3cb1" (UID: "2fafc589-0041-44b2-a66b-93f4676c3cb1"). InnerVolumeSpecName "kube-api-access-kr8th". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.543266 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kube-api-access-h6n44" (OuterVolumeSpecName: "kube-api-access-h6n44") pod "56c5fe9e-033d-4c3b-a71f-e2c215add4c5" (UID: "56c5fe9e-033d-4c3b-a71f-e2c215add4c5"). InnerVolumeSpecName "kube-api-access-h6n44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.549257 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d85c4b-9da3-40d5-a5c3-0aeac38eecee-kube-api-access-qcx7v" (OuterVolumeSpecName: "kube-api-access-qcx7v") pod "44d85c4b-9da3-40d5-a5c3-0aeac38eecee" (UID: "44d85c4b-9da3-40d5-a5c3-0aeac38eecee"). InnerVolumeSpecName "kube-api-access-qcx7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.549409 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a15b3dd7-69b2-480e-b61d-bba396447b88","Type":"ContainerDied","Data":"1389318952a4ecdc871220633fb534613019d433f527dfac6df8c882a2e3adce"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.549493 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.551230 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell091e2-account-delete-5d9qb" event={"ID":"45f95478-d16b-4ffb-9389-f68851cce4a6","Type":"ContainerDied","Data":"93ff912760b169e4dd1793686766af4537a3c6326730aafbf893ef9dc1071707"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.551325 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell091e2-account-delete-5d9qb" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.567740 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b07c8ee2-5443-410c-b2ab-b48699694626","Type":"ContainerDied","Data":"ec152cf5812d006e96ec0a494585fc49440422e421eb3a4d079d7093860d86a1"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.567755 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.577576 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5faf6a4-6d67-4104-817f-422bdde6bf30","Type":"ContainerDied","Data":"f05b53b0cabfeb9ac4a426b2f97a57b668cc6c38069d683b71c0b821afa7c510"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.577658 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.589004 4929 generic.go:334] "Generic (PLEG): container finished" podID="56c5fe9e-033d-4c3b-a71f-e2c215add4c5" containerID="4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c" exitCode=0 Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.589103 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.589653 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"56c5fe9e-033d-4c3b-a71f-e2c215add4c5","Type":"ContainerDied","Data":"4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.589675 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"56c5fe9e-033d-4c3b-a71f-e2c215add4c5","Type":"ContainerDied","Data":"275d09b21fbaf637c8ff4e1551f86439e4b28c0a701f2cf9546a5d93df1d005b"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.591739 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d58dd68b-qlcrz" event={"ID":"df0101ab-4fa3-4475-a685-fdd9ebb0ef68","Type":"ContainerDied","Data":"fc31bdfb0f18066d6e9960af0c4a27af9f9f720f2a5b8977bc86d8c65246110e"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.595223 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56d58dd68b-qlcrz" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.601135 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ace60114-0dd0-4f94-aad6-b1c2ace2c9d2","Type":"ContainerDied","Data":"71f533e042ae6efd7bf87043aecb7cb5be035b0c3ddb6ef83dbced3369f110e3"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.601221 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.601788 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.603269 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr8th\" (UniqueName: \"kubernetes.io/projected/2fafc589-0041-44b2-a66b-93f4676c3cb1-kube-api-access-kr8th\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.603296 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fafc589-0041-44b2-a66b-93f4676c3cb1-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.603309 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6n44\" (UniqueName: \"kubernetes.io/projected/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kube-api-access-h6n44\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.603322 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcx7v\" (UniqueName: \"kubernetes.io/projected/44d85c4b-9da3-40d5-a5c3-0aeac38eecee-kube-api-access-qcx7v\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.603334 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlgvp\" (UniqueName: \"kubernetes.io/projected/45f95478-d16b-4ffb-9389-f68851cce4a6-kube-api-access-hlgvp\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.603346 4929 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.603357 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.605781 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"51f77cee-69d5-4e5c-8707-a5be1914e351","Type":"ContainerDied","Data":"35d4f0c7ff12dfe5718821bd45e1f568401bdce14c220b92a3601e52685f7661"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.605858 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.608489 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1b9e6-account-delete-5mvft" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.608814 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1b9e6-account-delete-5mvft" event={"ID":"44d85c4b-9da3-40d5-a5c3-0aeac38eecee","Type":"ContainerDied","Data":"526992c2364b4e9057ed6ed6417319db8e496bacabbda84ceb04da248da272e6"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.611118 4929 generic.go:334] "Generic (PLEG): container finished" podID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerID="67d3645ec9cfed216d6036455755f8b22923aae7acb7d9366f616685db4f7af8" exitCode=0 Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.611220 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" event={"ID":"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f","Type":"ContainerDied","Data":"67d3645ec9cfed216d6036455755f8b22923aae7acb7d9366f616685db4f7af8"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.630392 4929 generic.go:334] "Generic (PLEG): container finished" podID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerID="83423c2654ca49651439a144e3eff0c4e3371ed929b89d62422a53c0a6be0dea" exitCode=0 Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.630691 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerDied","Data":"83423c2654ca49651439a144e3eff0c4e3371ed929b89d62422a53c0a6be0dea"} Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.630761 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancea7fb-account-delete-pvxdd" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.631675 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-99d9d588b-ddwr8" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.631753 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement5c11-account-delete-cw9t6" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.631802 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi6c15-account-delete-bdxsh" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.631843 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican9b10-account-delete-8lnrl" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.636376 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b67fd7d-2814-4efd-ad06-ee8283104d49" (UID: "4b67fd7d-2814-4efd-ad06-ee8283104d49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.697751 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.699211 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "df0101ab-4fa3-4475-a685-fdd9ebb0ef68" (UID: "df0101ab-4fa3-4475-a685-fdd9ebb0ef68"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.705325 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.705472 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.705636 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.718240 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-config-data" (OuterVolumeSpecName: "config-data") pod "2fafc589-0041-44b2-a66b-93f4676c3cb1" (UID: "2fafc589-0041-44b2-a66b-93f4676c3cb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.718869 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-config-data" (OuterVolumeSpecName: "config-data") pod "f5faf6a4-6d67-4104-817f-422bdde6bf30" (UID: "f5faf6a4-6d67-4104-817f-422bdde6bf30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.745436 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df0101ab-4fa3-4475-a685-fdd9ebb0ef68" (UID: "df0101ab-4fa3-4475-a685-fdd9ebb0ef68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.745499 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fafc589-0041-44b2-a66b-93f4676c3cb1" (UID: "2fafc589-0041-44b2-a66b-93f4676c3cb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.760779 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.789433 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c5fe9e-033d-4c3b-a71f-e2c215add4c5" (UID: "56c5fe9e-033d-4c3b-a71f-e2c215add4c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.801594 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "51f77cee-69d5-4e5c-8707-a5be1914e351" (UID: "51f77cee-69d5-4e5c-8707-a5be1914e351"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.807736 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.807764 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.807774 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.807782 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.807791 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.807801 4929 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.807810 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.874619 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data" (OuterVolumeSpecName: "config-data") pod "df0101ab-4fa3-4475-a685-fdd9ebb0ef68" (UID: "df0101ab-4fa3-4475-a685-fdd9ebb0ef68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.901497 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.909770 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.910060 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.909800 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5faf6a4-6d67-4104-817f-422bdde6bf30" (UID: "f5faf6a4-6d67-4104-817f-422bdde6bf30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.915833 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2fafc589-0041-44b2-a66b-93f4676c3cb1" (UID: "2fafc589-0041-44b2-a66b-93f4676c3cb1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.942650 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "56c5fe9e-033d-4c3b-a71f-e2c215add4c5" (UID: "56c5fe9e-033d-4c3b-a71f-e2c215add4c5"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.961362 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-config-data" (OuterVolumeSpecName: "config-data") pod "a15b3dd7-69b2-480e-b61d-bba396447b88" (UID: "a15b3dd7-69b2-480e-b61d-bba396447b88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:29 crc kubenswrapper[4929]: I1002 11:34:29.987527 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b67fd7d-2814-4efd-ad06-ee8283104d49" (UID: "4b67fd7d-2814-4efd-ad06-ee8283104d49"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.003269 4929 scope.go:117] "RemoveContainer" containerID="a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.011369 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.011398 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a15b3dd7-69b2-480e-b61d-bba396447b88-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.011408 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.011417 4929 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c5fe9e-033d-4c3b-a71f-e2c215add4c5-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.011427 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b67fd7d-2814-4efd-ad06-ee8283104d49-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.013268 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2fafc589-0041-44b2-a66b-93f4676c3cb1" (UID: "2fafc589-0041-44b2-a66b-93f4676c3cb1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.014230 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.025845 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "51f77cee-69d5-4e5c-8707-a5be1914e351" (UID: "51f77cee-69d5-4e5c-8707-a5be1914e351"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.031265 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f5faf6a4-6d67-4104-817f-422bdde6bf30" (UID: "f5faf6a4-6d67-4104-817f-422bdde6bf30"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.079205 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "b07c8ee2-5443-410c-b2ab-b48699694626" (UID: "b07c8ee2-5443-410c-b2ab-b48699694626"). 
InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.080744 4929 scope.go:117] "RemoveContainer" containerID="ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1" Oct 02 11:34:30 crc kubenswrapper[4929]: E1002 11:34:30.082102 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1\": container with ID starting with ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1 not found: ID does not exist" containerID="ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.082148 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1"} err="failed to get container status \"ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1\": rpc error: code = NotFound desc = could not find container \"ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1\": container with ID starting with ff861a7bd257ee37c97df3cf791360546f17d3013cd9c36914066aab2d6e1da1 not found: ID does not exist" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.082183 4929 scope.go:117] "RemoveContainer" containerID="a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd" Oct 02 11:34:30 crc kubenswrapper[4929]: E1002 11:34:30.084339 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd\": container with ID starting with a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd not found: ID does not exist" containerID="a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.084367 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd"} err="failed to get container status \"a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd\": rpc error: code = NotFound desc = could not find container \"a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd\": container with ID starting with a3f45aa76b2f8c04226f100d605badcece7e87345af2eed0bbf900c865b9a5dd not found: ID does not exist" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.084390 4929 scope.go:117] "RemoveContainer" containerID="d65560d220b0508cdd18383acef57b7ee3e6f336bff1911b49d9df550f30b608" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.086075 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "df0101ab-4fa3-4475-a685-fdd9ebb0ef68" (UID: "df0101ab-4fa3-4475-a685-fdd9ebb0ef68"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.131554 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-combined-ca-bundle\") pod \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.131627 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-logs\") pod \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.131664 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data\") pod \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.131720 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jts64\" (UniqueName: \"kubernetes.io/projected/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-kube-api-access-jts64\") pod \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.131773 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data-custom\") pod \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\" (UID: \"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.132096 4929 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/51f77cee-69d5-4e5c-8707-a5be1914e351-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.132113 4929 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5faf6a4-6d67-4104-817f-422bdde6bf30-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.132122 4929 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b07c8ee2-5443-410c-b2ab-b48699694626-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.132132 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fafc589-0041-44b2-a66b-93f4676c3cb1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.132140 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0101ab-4fa3-4475-a685-fdd9ebb0ef68-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: E1002 11:34:30.132220 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:34:30 crc kubenswrapper[4929]: E1002 11:34:30.132266 4929 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:34:38.132248383 +0000 UTC m=+1478.682614747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.138514 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-logs" (OuterVolumeSpecName: "logs") pod "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" (UID: "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.156768 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7b2b-account-delete-ghckr" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.158980 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-kube-api-access-jts64" (OuterVolumeSpecName: "kube-api-access-jts64") pod "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" (UID: "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f"). InnerVolumeSpecName "kube-api-access-jts64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.161551 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" (UID: "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.185933 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba6072c-759c-4261-8107-8243d262003d" path="/var/lib/kubelet/pods/0ba6072c-759c-4261-8107-8243d262003d/volumes" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.192863 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" path="/var/lib/kubelet/pods/23c56c4a-763f-4ce6-8b1f-d862662b16ec/volumes" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.221315 4929 scope.go:117] "RemoveContainer" containerID="451546a934f38d915cbb04879f0147f97d390d02d97e32ed999e258f1445f92c" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.223015 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" (UID: "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.223860 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" path="/var/lib/kubelet/pods/39949247-a1b3-41bc-a94a-4c59049576cd/volumes" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.224587 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86f887d-db93-49c4-85ed-add5f01b25f7" path="/var/lib/kubelet/pods/e86f887d-db93-49c4-85ed-add5f01b25f7/volumes" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.236490 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jts64\" (UniqueName: \"kubernetes.io/projected/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-kube-api-access-jts64\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.236524 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.236533 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.236543 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.292099 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data" (OuterVolumeSpecName: "config-data") pod "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" (UID: "842a33bb-8f7e-468a-96de-cf4d2b4a1d3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.337483 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/ae9c788d-5f22-443d-aa60-2f9e88dce9fd-kube-api-access-6kksl\") pod \"ae9c788d-5f22-443d-aa60-2f9e88dce9fd\" (UID: \"ae9c788d-5f22-443d-aa60-2f9e88dce9fd\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.338132 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.342025 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9c788d-5f22-443d-aa60-2f9e88dce9fd-kube-api-access-6kksl" (OuterVolumeSpecName: "kube-api-access-6kksl") pod "ae9c788d-5f22-443d-aa60-2f9e88dce9fd" (UID: "ae9c788d-5f22-443d-aa60-2f9e88dce9fd"). InnerVolumeSpecName "kube-api-access-6kksl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413342 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi6c15-account-delete-bdxsh"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413629 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi6c15-account-delete-bdxsh"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413645 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement5c11-account-delete-cw9t6"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413658 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement5c11-account-delete-cw9t6"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413668 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413680 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413689 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1b9e6-account-delete-5mvft"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413701 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1b9e6-account-delete-5mvft"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413712 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican9b10-account-delete-8lnrl"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413722 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican9b10-account-delete-8lnrl"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413733 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell091e2-account-delete-5d9qb"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413743 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell091e2-account-delete-5d9qb"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413751 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413761 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413770 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancea7fb-account-delete-pvxdd"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413779 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancea7fb-account-delete-pvxdd"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413788 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413798 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413807 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413818 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.413828 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:34:30 crc 
kubenswrapper[4929]: I1002 11:34:30.413839 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.438444 4929 scope.go:117] "RemoveContainer" containerID="41c95ee7304d226a10f65f9f4eb7f25c911fc6a7c2f8bd69eab49e70a8a3e99a" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.442158 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.444041 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/ae9c788d-5f22-443d-aa60-2f9e88dce9fd-kube-api-access-6kksl\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.478144 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.496780 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.546742 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-combined-ca-bundle\") pod \"95ec6412-e313-4ed7-ae20-d531571b5be6\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.546841 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-config-data\") pod \"95ec6412-e313-4ed7-ae20-d531571b5be6\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.546912 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4h7g\" (UniqueName: \"kubernetes.io/projected/95ec6412-e313-4ed7-ae20-d531571b5be6-kube-api-access-w4h7g\") pod \"95ec6412-e313-4ed7-ae20-d531571b5be6\" (UID: \"95ec6412-e313-4ed7-ae20-d531571b5be6\") " Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.560916 4929 scope.go:117] "RemoveContainer" containerID="baa739f57ecb397b52caaf29fc37695b9a4448c825f0be5520586e3d2f8dccf3" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.561216 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ec6412-e313-4ed7-ae20-d531571b5be6-kube-api-access-w4h7g" (OuterVolumeSpecName: "kube-api-access-w4h7g") pod "95ec6412-e313-4ed7-ae20-d531571b5be6" (UID: "95ec6412-e313-4ed7-ae20-d531571b5be6"). InnerVolumeSpecName "kube-api-access-w4h7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.579603 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-99d9d588b-ddwr8"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.600735 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95ec6412-e313-4ed7-ae20-d531571b5be6" (UID: "95ec6412-e313-4ed7-ae20-d531571b5be6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.604498 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-99d9d588b-ddwr8"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.620867 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-config-data" (OuterVolumeSpecName: "config-data") pod "95ec6412-e313-4ed7-ae20-d531571b5be6" (UID: "95ec6412-e313-4ed7-ae20-d531571b5be6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.658194 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4h7g\" (UniqueName: \"kubernetes.io/projected/95ec6412-e313-4ed7-ae20-d531571b5be6-kube-api-access-w4h7g\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.658250 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.658264 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ec6412-e313-4ed7-ae20-d531571b5be6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.667427 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" event={"ID":"842a33bb-8f7e-468a-96de-cf4d2b4a1d3f","Type":"ContainerDied","Data":"d36ae9f5fc9ba60dce7832f3d0e4cb76f1e0f12d658a9e4badbd8655d11a8150"} Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.667510 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-655677957d-l5jzm" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.678400 4929 generic.go:334] "Generic (PLEG): container finished" podID="95ec6412-e313-4ed7-ae20-d531571b5be6" containerID="6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" exitCode=0 Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.678660 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95ec6412-e313-4ed7-ae20-d531571b5be6","Type":"ContainerDied","Data":"6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67"} Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.678719 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95ec6412-e313-4ed7-ae20-d531571b5be6","Type":"ContainerDied","Data":"881cd7701ce6b1028e61b958f203d5939358ab820c6a13c4b8012035c875db16"} Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.678658 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.683165 4929 scope.go:117] "RemoveContainer" containerID="263c51355f740ae621964dc8410f6176b2dc8b9d938a5de67eddf2dd56884ff0" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.683463 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7b2b-account-delete-ghckr" event={"ID":"ae9c788d-5f22-443d-aa60-2f9e88dce9fd","Type":"ContainerDied","Data":"cc9620e8f068af3c2289866c70bf0a776ab9d669d30d45ca2cad2b9066c2db3c"} Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.683558 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7b2b-account-delete-ghckr" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.695871 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56d58dd68b-qlcrz"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.731806 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56d58dd68b-qlcrz"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.733756 4929 scope.go:117] "RemoveContainer" containerID="2b859fd219d68a03c833c80a4486933cb925eae152050cfa50df66277e417160" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.751475 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.763521 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.767729 4929 scope.go:117] "RemoveContainer" containerID="0af63ab6c39474b0cbbc4d5e79bcc89441bf8fd2cd3ab19fdedfebef27a37122" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.776772 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.783784 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.790087 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-655677957d-l5jzm"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.798109 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-655677957d-l5jzm"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.800916 4929 scope.go:117] "RemoveContainer" containerID="b99a17b7ec88652946955b5fdf985f5b9d3d8bd15ef24dfadcf98117eac94d02" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.803853 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron7b2b-account-delete-ghckr"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.818450 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron7b2b-account-delete-ghckr"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.838189 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.844741 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.846797 4929 scope.go:117] "RemoveContainer" containerID="3153b1529e0c32bf3f30edef4acbd966c8aa5b2583cf6efe7dd0a6f5cab02ebd" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.871994 4929 scope.go:117] "RemoveContainer" 
containerID="4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.922190 4929 scope.go:117] "RemoveContainer" containerID="4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c" Oct 02 11:34:30 crc kubenswrapper[4929]: E1002 11:34:30.922619 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c\": container with ID starting with 4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c not found: ID does not exist" containerID="4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.922699 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c"} err="failed to get container status \"4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c\": rpc error: code = NotFound desc = could not find container \"4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c\": container with ID starting with 4066cd2afbb751369c0842463e42e6a8f5725f93ef52604da7b3b685f0ea068c not found: ID does not exist" Oct 02 11:34:30 crc kubenswrapper[4929]: I1002 11:34:30.922732 4929 scope.go:117] "RemoveContainer" containerID="e53b3ad5b5ce14f2519bfc0e1a58672bd568af0b97109869b7485930f064cca6" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:30.997491 4929 scope.go:117] "RemoveContainer" containerID="21cd6c9af3eab7ce82f56ddfb8a37ffb6223a10844bf34a7a2c8ec24da313794" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.020191 4929 scope.go:117] "RemoveContainer" containerID="6b5c65593dc6d88e4e6e6322915fd952cd6ca9e932c894eb48b4bc84d07e1554" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.061712 4929 scope.go:117] "RemoveContainer" containerID="e0f273d8b045c5750b84ae82eb3de630e392ed10f8ecf765920cb992fe5bf07b" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.092908 4929 scope.go:117] "RemoveContainer" containerID="2cae7ce5a315b3a1de86b3016d5d89f4784fd9aa5ec045ee66d7c930f75073c6" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.113434 4929 scope.go:117] "RemoveContainer" containerID="759ab44b95a333eb21ce2ed2b607f3a10c0432cd2c8a3f26ea1c58203462e5b9" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.137037 4929 scope.go:117] "RemoveContainer" containerID="67d3645ec9cfed216d6036455755f8b22923aae7acb7d9366f616685db4f7af8" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.187465 4929 scope.go:117] "RemoveContainer" containerID="e3c00d90ab5c8fdbfb94fad352ef76e3b0dd878ba364460bbb331b6a693a2e07" Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.190999 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412 is running failed: container process not found" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.191872 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412 is running failed: container process not found" 
containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.192379 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412 is running failed: container process not found" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.192467 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="b8b9fa36-f990-4cce-9544-23828715aa54" containerName="nova-cell1-conductor-conductor" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.244222 4929 scope.go:117] "RemoveContainer" containerID="6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.278862 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.282364 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.284174 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.284240 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="ovn-northd" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.284727 4929 scope.go:117] "RemoveContainer" containerID="6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.285180 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67\": container with ID starting with 6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67 not found: ID does not exist" containerID="6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.285202 4929 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67"} err="failed to get container status \"6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67\": rpc error: code = NotFound desc = could not find container \"6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67\": container with ID starting with 6eeac430b40deece85428b9de9883600591caf3c89762baaf063a5b0736f9a67 not found: ID does not exist" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.285220 4929 scope.go:117] "RemoveContainer" containerID="dc7dcc0a747f65b16e6a03a780cc29f5adf362e38ab2c87519ed5de907c6ce29" Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.343174 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.345099 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.346186 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.346218 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="978200e0-025d-4000-baed-4ba85bf83c60" containerName="galera" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.364497 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-69566f664c-jps5x" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.162:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.365023 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-69566f664c-jps5x" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.162:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.636849 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.645798 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bd786b699-2sf9r" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.707883 4929 generic.go:334] "Generic (PLEG): container finished" podID="62e033b9-12bd-4de4-ba18-807beaca68db" containerID="52e15741d914815b2fb093a46215236fea49e8f8564b50718e5c10df7b9ff3e8" exitCode=0 Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.707937 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f54bbfbbc-rzbv9" event={"ID":"62e033b9-12bd-4de4-ba18-807beaca68db","Type":"ContainerDied","Data":"52e15741d914815b2fb093a46215236fea49e8f8564b50718e5c10df7b9ff3e8"} Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.715648 4929 generic.go:334] "Generic (PLEG): container finished" podID="b8b9fa36-f990-4cce-9544-23828715aa54" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" exitCode=0 Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.715681 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.715765 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b8b9fa36-f990-4cce-9544-23828715aa54","Type":"ContainerDied","Data":"8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412"} Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.715835 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b8b9fa36-f990-4cce-9544-23828715aa54","Type":"ContainerDied","Data":"e10e2cca3042977bed86cea313d507d05c357b429279eaa46c4153299153c684"} Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.716037 4929 scope.go:117] "RemoveContainer" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.733863 4929 generic.go:334] "Generic (PLEG): container finished" podID="c89c2414-cee5-46e9-9284-cd96fb472fd7" containerID="5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3" exitCode=0 Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.733997 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bd786b699-2sf9r" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.734096 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd786b699-2sf9r" event={"ID":"c89c2414-cee5-46e9-9284-cd96fb472fd7","Type":"ContainerDied","Data":"5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3"} Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.734136 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bd786b699-2sf9r" event={"ID":"c89c2414-cee5-46e9-9284-cd96fb472fd7","Type":"ContainerDied","Data":"09ac7038040fd303886c5015104dd3d2ff0fd294f873fef0596991d6d8779ae3"} Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.737990 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_61e50682-8502-4570-916a-a3b90a5218e4/ovn-northd/0.log" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.738039 4929 generic.go:334] "Generic (PLEG): container finished" podID="61e50682-8502-4570-916a-a3b90a5218e4" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" exitCode=139 Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.738105 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"61e50682-8502-4570-916a-a3b90a5218e4","Type":"ContainerDied","Data":"c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a"} Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.753071 4929 scope.go:117] "RemoveContainer" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.753510 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412\": container with ID starting with 8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412 not found: ID does not exist" containerID="8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.753554 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412"} err="failed to get container status \"8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412\": rpc error: code = NotFound desc = could not find container \"8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412\": container with ID starting with 8e3ebfea13b1ce6b9b2b90f9c5ab87c17ffb4f8d09b5172e58a49f3b5f111412 not found: ID does not exist" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.753583 4929 scope.go:117] "RemoveContainer" containerID="5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.775239 4929 scope.go:117] "RemoveContainer" containerID="5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3" Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.775739 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3\": container with ID starting with 5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3 not found: ID does not exist" containerID="5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 
11:34:31.775792 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3"} err="failed to get container status \"5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3\": rpc error: code = NotFound desc = could not find container \"5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3\": container with ID starting with 5af316d90d37c122d457ef1e51c2be281f38f1fa7a4dd566580eafeea18457a3 not found: ID does not exist" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777238 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-fernet-keys\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777289 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-internal-tls-certs\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777330 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-config-data\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777346 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-scripts\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777381 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-combined-ca-bundle\") pod \"b8b9fa36-f990-4cce-9544-23828715aa54\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777408 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-credential-keys\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777424 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-combined-ca-bundle\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777505 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gztcl\" (UniqueName: \"kubernetes.io/projected/c89c2414-cee5-46e9-9284-cd96fb472fd7-kube-api-access-gztcl\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777524 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-public-tls-certs\") pod \"c89c2414-cee5-46e9-9284-cd96fb472fd7\" (UID: \"c89c2414-cee5-46e9-9284-cd96fb472fd7\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777551 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-config-data\") pod \"b8b9fa36-f990-4cce-9544-23828715aa54\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.777565 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt9mg\" (UniqueName: \"kubernetes.io/projected/b8b9fa36-f990-4cce-9544-23828715aa54-kube-api-access-nt9mg\") pod \"b8b9fa36-f990-4cce-9544-23828715aa54\" (UID: \"b8b9fa36-f990-4cce-9544-23828715aa54\") " Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.784673 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.785581 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89c2414-cee5-46e9-9284-cd96fb472fd7-kube-api-access-gztcl" (OuterVolumeSpecName: "kube-api-access-gztcl") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "kube-api-access-gztcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.785778 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b9fa36-f990-4cce-9544-23828715aa54-kube-api-access-nt9mg" (OuterVolumeSpecName: "kube-api-access-nt9mg") pod "b8b9fa36-f990-4cce-9544-23828715aa54" (UID: "b8b9fa36-f990-4cce-9544-23828715aa54"). InnerVolumeSpecName "kube-api-access-nt9mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.793277 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.797876 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-scripts" (OuterVolumeSpecName: "scripts") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.825982 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-config-data" (OuterVolumeSpecName: "config-data") pod "b8b9fa36-f990-4cce-9544-23828715aa54" (UID: "b8b9fa36-f990-4cce-9544-23828715aa54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.839550 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.863122 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8b9fa36-f990-4cce-9544-23828715aa54" (UID: "b8b9fa36-f990-4cce-9544-23828715aa54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.877463 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879179 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gztcl\" (UniqueName: \"kubernetes.io/projected/c89c2414-cee5-46e9-9284-cd96fb472fd7-kube-api-access-gztcl\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879206 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879221 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt9mg\" (UniqueName: \"kubernetes.io/projected/b8b9fa36-f990-4cce-9544-23828715aa54-kube-api-access-nt9mg\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879233 4929 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879244 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879255 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879265 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b9fa36-f990-4cce-9544-23828715aa54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879276 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 
02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.879286 4929 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.891192 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-config-data" (OuterVolumeSpecName: "config-data") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.899748 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c89c2414-cee5-46e9-9284-cd96fb472fd7" (UID: "c89c2414-cee5-46e9-9284-cd96fb472fd7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.909318 4929 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 02 11:34:31 crc kubenswrapper[4929]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-02T11:34:24Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 02 11:34:31 crc kubenswrapper[4929]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Oct 02 11:34:31 crc kubenswrapper[4929]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-8kqgz" message=< Oct 02 11:34:31 crc kubenswrapper[4929]: Exiting ovn-controller (1) [FAILED] Oct 02 11:34:31 crc kubenswrapper[4929]: Killing ovn-controller (1) [ OK ] Oct 02 11:34:31 crc kubenswrapper[4929]: Killing ovn-controller (1) with SIGKILL [ OK ] Oct 02 11:34:31 crc kubenswrapper[4929]: 2025-10-02T11:34:24Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 02 11:34:31 crc kubenswrapper[4929]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Oct 02 11:34:31 crc kubenswrapper[4929]: > Oct 02 11:34:31 crc kubenswrapper[4929]: E1002 11:34:31.909492 4929 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 02 11:34:31 crc kubenswrapper[4929]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-02T11:34:24Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 02 11:34:31 crc kubenswrapper[4929]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Oct 02 11:34:31 crc kubenswrapper[4929]: > pod="openstack/ovn-controller-8kqgz" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" containerID="cri-o://fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d" Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.909530 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-8kqgz" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" containerID="cri-o://fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d" gracePeriod=22 Oct 02 11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.982419 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 
11:34:31 crc kubenswrapper[4929]: I1002 11:34:31.982452 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c89c2414-cee5-46e9-9284-cd96fb472fd7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.007577 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_61e50682-8502-4570-916a-a3b90a5218e4/ovn-northd/0.log" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.007651 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.084611 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7bd786b699-2sf9r"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.109458 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7bd786b699-2sf9r"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.126802 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.132877 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.173795 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" path="/var/lib/kubelet/pods/2fafc589-0041-44b2-a66b-93f4676c3cb1/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.174411 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d85c4b-9da3-40d5-a5c3-0aeac38eecee" path="/var/lib/kubelet/pods/44d85c4b-9da3-40d5-a5c3-0aeac38eecee/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.174945 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f95478-d16b-4ffb-9389-f68851cce4a6" path="/var/lib/kubelet/pods/45f95478-d16b-4ffb-9389-f68851cce4a6/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.176074 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" path="/var/lib/kubelet/pods/4b67fd7d-2814-4efd-ad06-ee8283104d49/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.176670 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f77cee-69d5-4e5c-8707-a5be1914e351" path="/var/lib/kubelet/pods/51f77cee-69d5-4e5c-8707-a5be1914e351/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.177251 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c5fe9e-033d-4c3b-a71f-e2c215add4c5" path="/var/lib/kubelet/pods/56c5fe9e-033d-4c3b-a71f-e2c215add4c5/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.177740 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d52b938-d877-46ba-b19c-7e6331422d01" path="/var/lib/kubelet/pods/7d52b938-d877-46ba-b19c-7e6331422d01/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.180657 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" path="/var/lib/kubelet/pods/842a33bb-8f7e-468a-96de-cf4d2b4a1d3f/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.181280 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ec6412-e313-4ed7-ae20-d531571b5be6" path="/var/lib/kubelet/pods/95ec6412-e313-4ed7-ae20-d531571b5be6/volumes" Oct 02 11:34:32 crc 
kubenswrapper[4929]: I1002 11:34:32.182414 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" path="/var/lib/kubelet/pods/a15b3dd7-69b2-480e-b61d-bba396447b88/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.183237 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18e7ab5-8994-4a34-98d2-0e65bbfc4068" path="/var/lib/kubelet/pods/a18e7ab5-8994-4a34-98d2-0e65bbfc4068/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.183744 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" path="/var/lib/kubelet/pods/ace60114-0dd0-4f94-aad6-b1c2ace2c9d2/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.186011 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9c788d-5f22-443d-aa60-2f9e88dce9fd" path="/var/lib/kubelet/pods/ae9c788d-5f22-443d-aa60-2f9e88dce9fd/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187059 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b07c8ee2-5443-410c-b2ab-b48699694626" path="/var/lib/kubelet/pods/b07c8ee2-5443-410c-b2ab-b48699694626/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187213 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-ovn-northd-tls-certs\") pod \"61e50682-8502-4570-916a-a3b90a5218e4\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187256 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-combined-ca-bundle\") pod \"61e50682-8502-4570-916a-a3b90a5218e4\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187336 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-scripts\") pod \"61e50682-8502-4570-916a-a3b90a5218e4\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187358 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/61e50682-8502-4570-916a-a3b90a5218e4-ovn-rundir\") pod \"61e50682-8502-4570-916a-a3b90a5218e4\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187462 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psfwp\" (UniqueName: \"kubernetes.io/projected/61e50682-8502-4570-916a-a3b90a5218e4-kube-api-access-psfwp\") pod \"61e50682-8502-4570-916a-a3b90a5218e4\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187505 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-config\") pod \"61e50682-8502-4570-916a-a3b90a5218e4\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.187520 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-metrics-certs-tls-certs\") pod \"61e50682-8502-4570-916a-a3b90a5218e4\" (UID: \"61e50682-8502-4570-916a-a3b90a5218e4\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.188456 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-scripts" (OuterVolumeSpecName: "scripts") pod "61e50682-8502-4570-916a-a3b90a5218e4" (UID: "61e50682-8502-4570-916a-a3b90a5218e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.188580 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b9fa36-f990-4cce-9544-23828715aa54" path="/var/lib/kubelet/pods/b8b9fa36-f990-4cce-9544-23828715aa54/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.189359 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c89c2414-cee5-46e9-9284-cd96fb472fd7" path="/var/lib/kubelet/pods/c89c2414-cee5-46e9-9284-cd96fb472fd7/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.190350 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-config" (OuterVolumeSpecName: "config") pod "61e50682-8502-4570-916a-a3b90a5218e4" (UID: "61e50682-8502-4570-916a-a3b90a5218e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.190496 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61e50682-8502-4570-916a-a3b90a5218e4-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "61e50682-8502-4570-916a-a3b90a5218e4" (UID: "61e50682-8502-4570-916a-a3b90a5218e4"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.190632 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" path="/var/lib/kubelet/pods/c8ebda2a-aee6-4eed-8333-5e96219fdcb3/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.194049 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" path="/var/lib/kubelet/pods/df0101ab-4fa3-4475-a685-fdd9ebb0ef68/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.194834 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f" path="/var/lib/kubelet/pods/df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.195333 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7503492-8d47-4852-aca4-0bb661665127" path="/var/lib/kubelet/pods/e7503492-8d47-4852-aca4-0bb661665127/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.200654 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" path="/var/lib/kubelet/pods/f5faf6a4-6d67-4104-817f-422bdde6bf30/volumes" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.225080 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e50682-8502-4570-916a-a3b90a5218e4-kube-api-access-psfwp" (OuterVolumeSpecName: "kube-api-access-psfwp") pod "61e50682-8502-4570-916a-a3b90a5218e4" (UID: "61e50682-8502-4570-916a-a3b90a5218e4"). InnerVolumeSpecName "kube-api-access-psfwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.282471 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61e50682-8502-4570-916a-a3b90a5218e4" (UID: "61e50682-8502-4570-916a-a3b90a5218e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.289555 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.289588 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.289597 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/61e50682-8502-4570-916a-a3b90a5218e4-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.289606 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psfwp\" (UniqueName: \"kubernetes.io/projected/61e50682-8502-4570-916a-a3b90a5218e4-kube-api-access-psfwp\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.289616 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61e50682-8502-4570-916a-a3b90a5218e4-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.318673 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "61e50682-8502-4570-916a-a3b90a5218e4" (UID: "61e50682-8502-4570-916a-a3b90a5218e4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.324117 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "61e50682-8502-4570-916a-a3b90a5218e4" (UID: "61e50682-8502-4570-916a-a3b90a5218e4"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.392703 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.392737 4929 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e50682-8502-4570-916a-a3b90a5218e4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.398937 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.440818 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8kqgz_752197b6-8008-4699-895b-4cbf3d475e96/ovn-controller/0.log" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.440905 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8kqgz" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.493805 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-internal-tls-certs\") pod \"62e033b9-12bd-4de4-ba18-807beaca68db\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.493929 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-config\") pod \"62e033b9-12bd-4de4-ba18-807beaca68db\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.494013 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xmsj\" (UniqueName: \"kubernetes.io/projected/62e033b9-12bd-4de4-ba18-807beaca68db-kube-api-access-5xmsj\") pod \"62e033b9-12bd-4de4-ba18-807beaca68db\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.494064 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-combined-ca-bundle\") pod \"62e033b9-12bd-4de4-ba18-807beaca68db\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.494128 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-ovndb-tls-certs\") pod \"62e033b9-12bd-4de4-ba18-807beaca68db\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.494154 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-public-tls-certs\") pod \"62e033b9-12bd-4de4-ba18-807beaca68db\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.494183 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-httpd-config\") pod \"62e033b9-12bd-4de4-ba18-807beaca68db\" (UID: \"62e033b9-12bd-4de4-ba18-807beaca68db\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.497902 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "62e033b9-12bd-4de4-ba18-807beaca68db" (UID: "62e033b9-12bd-4de4-ba18-807beaca68db"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.498354 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e033b9-12bd-4de4-ba18-807beaca68db-kube-api-access-5xmsj" (OuterVolumeSpecName: "kube-api-access-5xmsj") pod "62e033b9-12bd-4de4-ba18-807beaca68db" (UID: "62e033b9-12bd-4de4-ba18-807beaca68db"). InnerVolumeSpecName "kube-api-access-5xmsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.537635 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "62e033b9-12bd-4de4-ba18-807beaca68db" (UID: "62e033b9-12bd-4de4-ba18-807beaca68db"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.539528 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-config" (OuterVolumeSpecName: "config") pod "62e033b9-12bd-4de4-ba18-807beaca68db" (UID: "62e033b9-12bd-4de4-ba18-807beaca68db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.539881 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62e033b9-12bd-4de4-ba18-807beaca68db" (UID: "62e033b9-12bd-4de4-ba18-807beaca68db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.543453 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "62e033b9-12bd-4de4-ba18-807beaca68db" (UID: "62e033b9-12bd-4de4-ba18-807beaca68db"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.561186 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "62e033b9-12bd-4de4-ba18-807beaca68db" (UID: "62e033b9-12bd-4de4-ba18-807beaca68db"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.595824 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/752197b6-8008-4699-895b-4cbf3d475e96-scripts\") pod \"752197b6-8008-4699-895b-4cbf3d475e96\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596137 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-combined-ca-bundle\") pod \"752197b6-8008-4699-895b-4cbf3d475e96\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596161 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsb4b\" (UniqueName: \"kubernetes.io/projected/752197b6-8008-4699-895b-4cbf3d475e96-kube-api-access-bsb4b\") pod \"752197b6-8008-4699-895b-4cbf3d475e96\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596190 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run\") pod \"752197b6-8008-4699-895b-4cbf3d475e96\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596213 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run-ovn\") pod \"752197b6-8008-4699-895b-4cbf3d475e96\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596270 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-ovn-controller-tls-certs\") pod \"752197b6-8008-4699-895b-4cbf3d475e96\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596284 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-log-ovn\") pod \"752197b6-8008-4699-895b-4cbf3d475e96\" (UID: \"752197b6-8008-4699-895b-4cbf3d475e96\") " Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596313 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run" (OuterVolumeSpecName: "var-run") pod "752197b6-8008-4699-895b-4cbf3d475e96" (UID: "752197b6-8008-4699-895b-4cbf3d475e96"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596465 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "752197b6-8008-4699-895b-4cbf3d475e96" (UID: "752197b6-8008-4699-895b-4cbf3d475e96"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596550 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "752197b6-8008-4699-895b-4cbf3d475e96" (UID: "752197b6-8008-4699-895b-4cbf3d475e96"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596796 4929 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596809 4929 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596819 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596827 4929 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596836 4929 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596844 4929 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596853 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596862 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xmsj\" (UniqueName: \"kubernetes.io/projected/62e033b9-12bd-4de4-ba18-807beaca68db-kube-api-access-5xmsj\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596881 4929 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/752197b6-8008-4699-895b-4cbf3d475e96-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.596889 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e033b9-12bd-4de4-ba18-807beaca68db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.597432 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752197b6-8008-4699-895b-4cbf3d475e96-scripts" (OuterVolumeSpecName: "scripts") pod "752197b6-8008-4699-895b-4cbf3d475e96" (UID: "752197b6-8008-4699-895b-4cbf3d475e96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.599517 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752197b6-8008-4699-895b-4cbf3d475e96-kube-api-access-bsb4b" (OuterVolumeSpecName: "kube-api-access-bsb4b") pod "752197b6-8008-4699-895b-4cbf3d475e96" (UID: "752197b6-8008-4699-895b-4cbf3d475e96"). InnerVolumeSpecName "kube-api-access-bsb4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.617950 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "752197b6-8008-4699-895b-4cbf3d475e96" (UID: "752197b6-8008-4699-895b-4cbf3d475e96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.659586 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "752197b6-8008-4699-895b-4cbf3d475e96" (UID: "752197b6-8008-4699-895b-4cbf3d475e96"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.698547 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.698586 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/752197b6-8008-4699-895b-4cbf3d475e96-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.698594 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/752197b6-8008-4699-895b-4cbf3d475e96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.698602 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsb4b\" (UniqueName: \"kubernetes.io/projected/752197b6-8008-4699-895b-4cbf3d475e96-kube-api-access-bsb4b\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.774745 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f54bbfbbc-rzbv9" event={"ID":"62e033b9-12bd-4de4-ba18-807beaca68db","Type":"ContainerDied","Data":"bda45a2f099cd2429f8ccbfa6ba0badb7ccc111062372af4cf272f5542413c30"} Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.774821 4929 scope.go:117] "RemoveContainer" containerID="4a26eb13a68fc86fca37ccadbc35bdf199a826d5b4a5034fe350778970631e25" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.775053 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f54bbfbbc-rzbv9" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.781092 4929 generic.go:334] "Generic (PLEG): container finished" podID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerID="26cde39221ac8a2072a3fb8c38cbfe2e085b51f160c11eafe83d008ddf719bf8" exitCode=0 Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.781405 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerDied","Data":"26cde39221ac8a2072a3fb8c38cbfe2e085b51f160c11eafe83d008ddf719bf8"} Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.787660 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8kqgz_752197b6-8008-4699-895b-4cbf3d475e96/ovn-controller/0.log" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.787700 4929 generic.go:334] "Generic (PLEG): container finished" podID="752197b6-8008-4699-895b-4cbf3d475e96" containerID="fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d" exitCode=137 Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.787753 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz" event={"ID":"752197b6-8008-4699-895b-4cbf3d475e96","Type":"ContainerDied","Data":"fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d"} Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.787774 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8kqgz" event={"ID":"752197b6-8008-4699-895b-4cbf3d475e96","Type":"ContainerDied","Data":"55f0d042af1930cc8212ac7acfbb68a154146277736cbbfb00842d288dca9c54"} Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.787773 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8kqgz" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.803672 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_61e50682-8502-4570-916a-a3b90a5218e4/ovn-northd/0.log" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.803745 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"61e50682-8502-4570-916a-a3b90a5218e4","Type":"ContainerDied","Data":"2c43719f35986168c0d7d320db2819d69012116b5e71c9255527e81b3f4584a1"} Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.803876 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.816569 4929 scope.go:117] "RemoveContainer" containerID="52e15741d914815b2fb093a46215236fea49e8f8564b50718e5c10df7b9ff3e8" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.827199 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f54bbfbbc-rzbv9"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.845784 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f54bbfbbc-rzbv9"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.848752 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8kqgz"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.860116 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8kqgz"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.870524 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.870851 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.876447 4929 scope.go:117] "RemoveContainer" containerID="fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.905232 4929 scope.go:117] "RemoveContainer" containerID="fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d" Oct 02 11:34:32 crc kubenswrapper[4929]: E1002 11:34:32.906143 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d\": container with ID starting with fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d not found: ID does not exist" containerID="fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.906184 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d"} err="failed to get container status \"fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d\": rpc error: code = NotFound desc = could not find container \"fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d\": container with ID starting with fd16243999d5bc18b4d1f95481cfe464d99de9514ad0b6fae46f880f85689c1d not found: ID does not exist" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.906209 4929 scope.go:117] "RemoveContainer" containerID="a800d27ad9ba4905470d759a654c04cea37a9ca62559cf4a2feee8d6683bdd38" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.907135 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:34:32 crc kubenswrapper[4929]: I1002 11:34:32.939313 4929 scope.go:117] "RemoveContainer" containerID="c5e669a7d5fbc9122e13a3b2c52e0a13e7513c398d52f20d16601b7965aaac7a" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.003085 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-run-httpd\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.003197 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-sg-core-conf-yaml\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.003663 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.003828 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8n82\" (UniqueName: \"kubernetes.io/projected/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-kube-api-access-n8n82\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.003889 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-scripts\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.003930 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-config-data\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.004020 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-combined-ca-bundle\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.004065 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-log-httpd\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.004105 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-ceilometer-tls-certs\") pod \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\" (UID: \"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8\") " Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.004515 4929 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.004588 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.004651 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data podName:dfb673e7-59bc-41d1-9bf0-d20527c4a740 nodeName:}" failed. No retries permitted until 2025-10-02 11:34:41.00463206 +0000 UTC m=+1481.554998424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data") pod "rabbitmq-cell1-server-0" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.005070 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.008554 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-scripts" (OuterVolumeSpecName: "scripts") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.009297 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-kube-api-access-n8n82" (OuterVolumeSpecName: "kube-api-access-n8n82") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "kube-api-access-n8n82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.041726 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.049726 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.072593 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.086009 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-config-data" (OuterVolumeSpecName: "config-data") pod "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" (UID: "f090f547-93e3-4b7f-a3c8-8d97c8b2fca8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.106415 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.106465 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8n82\" (UniqueName: \"kubernetes.io/projected/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-kube-api-access-n8n82\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.106479 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.106508 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.106519 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.106571 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.106582 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.598218 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.598824 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.599379 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.599419 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server" Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.600754 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.602218 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.603841 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:34:33 crc kubenswrapper[4929]: E1002 11:34:33.603904 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.828082 4929 generic.go:334] "Generic (PLEG): container finished" podID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerID="1300e80581b8037301a49cd07f0c5f8de41330fcc719f6803e48273136aa7404" exitCode=0 Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.828424 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfb673e7-59bc-41d1-9bf0-d20527c4a740","Type":"ContainerDied","Data":"1300e80581b8037301a49cd07f0c5f8de41330fcc719f6803e48273136aa7404"} Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.832242 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.832115 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f090f547-93e3-4b7f-a3c8-8d97c8b2fca8","Type":"ContainerDied","Data":"94e3355a2acf155dec4f8266e43690cd87630399c3d1cbf5e7d488f92a2c1022"} Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.833230 4929 scope.go:117] "RemoveContainer" containerID="e5d8d0d67dce0c56c6f683c231f1e918df07bdc899e3c55e672d1e0c06c2472e" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.838238 4929 generic.go:334] "Generic (PLEG): container finished" podID="978200e0-025d-4000-baed-4ba85bf83c60" containerID="feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d" exitCode=0 Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.838312 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"978200e0-025d-4000-baed-4ba85bf83c60","Type":"ContainerDied","Data":"feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d"} Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.863899 4929 scope.go:117] "RemoveContainer" containerID="c8b786a68a0810d547f648303a29ccea6d4efcfa31e794fa4cf6a27a57b61127" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.876746 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.882206 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.900272 4929 scope.go:117] "RemoveContainer" containerID="26cde39221ac8a2072a3fb8c38cbfe2e085b51f160c11eafe83d008ddf719bf8" Oct 02 11:34:33 crc kubenswrapper[4929]: I1002 11:34:33.927625 4929 scope.go:117] "RemoveContainer" containerID="83423c2654ca49651439a144e3eff0c4e3371ed929b89d62422a53c0a6be0dea" Oct 02 11:34:34 crc kubenswrapper[4929]: E1002 11:34:34.024098 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf090f547_93e3_4b7f_a3c8_8d97c8b2fca8.slice\": RecentStats: unable to find data in memory cache]" Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.074017 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.087664 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.173238 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61e50682-8502-4570-916a-a3b90a5218e4" path="/var/lib/kubelet/pods/61e50682-8502-4570-916a-a3b90a5218e4/volumes" Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.173814 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" path="/var/lib/kubelet/pods/62e033b9-12bd-4de4-ba18-807beaca68db/volumes" Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.174382 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752197b6-8008-4699-895b-4cbf3d475e96" path="/var/lib/kubelet/pods/752197b6-8008-4699-895b-4cbf3d475e96/volumes" Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.175526 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" path="/var/lib/kubelet/pods/f090f547-93e3-4b7f-a3c8-8d97c8b2fca8/volumes" Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.224935 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfb673e7-59bc-41d1-9bf0-d20527c4a740-pod-info\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225019 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-server-conf\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225049 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nxf2\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-kube-api-access-5nxf2\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225070 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225102 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-plugins-conf\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225125 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfb673e7-59bc-41d1-9bf0-d20527c4a740-erlang-cookie-secret\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") " Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225139 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-secrets\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") " Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 
11:34:34.225156 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-galera-tls-certs\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225180 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmj8d\" (UniqueName: \"kubernetes.io/projected/978200e0-025d-4000-baed-4ba85bf83c60-kube-api-access-vmj8d\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225737 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-confd\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225768 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225864 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-tls\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.225894 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-plugins\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226020 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-erlang-cookie\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226046 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-combined-ca-bundle\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226079 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-config-data-default\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226096 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-kolla-config\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226110 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-operator-scripts\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226154 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data\") pod \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\" (UID: \"dfb673e7-59bc-41d1-9bf0-d20527c4a740\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226149 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226193 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/978200e0-025d-4000-baed-4ba85bf83c60-config-data-generated\") pod \"978200e0-025d-4000-baed-4ba85bf83c60\" (UID: \"978200e0-025d-4000-baed-4ba85bf83c60\") "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226447 4929 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.226789 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/978200e0-025d-4000-baed-4ba85bf83c60-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.227169 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.227728 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.227746 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.227808 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.228269 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.230615 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb673e7-59bc-41d1-9bf0-d20527c4a740-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.231606 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-kube-api-access-5nxf2" (OuterVolumeSpecName: "kube-api-access-5nxf2") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "kube-api-access-5nxf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.232781 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.232860 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-secrets" (OuterVolumeSpecName: "secrets") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.233067 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978200e0-025d-4000-baed-4ba85bf83c60-kube-api-access-vmj8d" (OuterVolumeSpecName: "kube-api-access-vmj8d") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "kube-api-access-vmj8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.232899 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.234763 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dfb673e7-59bc-41d1-9bf0-d20527c4a740-pod-info" (OuterVolumeSpecName: "pod-info") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.238584 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.256730 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data" (OuterVolumeSpecName: "config-data") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.260115 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.270292 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-server-conf" (OuterVolumeSpecName: "server-conf") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.275488 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "978200e0-025d-4000-baed-4ba85bf83c60" (UID: "978200e0-025d-4000-baed-4ba85bf83c60"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.311159 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dfb673e7-59bc-41d1-9bf0-d20527c4a740" (UID: "dfb673e7-59bc-41d1-9bf0-d20527c4a740"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328003 4929 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfb673e7-59bc-41d1-9bf0-d20527c4a740-pod-info\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328301 4929 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-server-conf\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328311 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nxf2\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-kube-api-access-5nxf2\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328339 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328348 4929 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfb673e7-59bc-41d1-9bf0-d20527c4a740-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328356 4929 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-secrets\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328364 4929 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328375 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmj8d\" (UniqueName: \"kubernetes.io/projected/978200e0-025d-4000-baed-4ba85bf83c60-kube-api-access-vmj8d\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328382 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328399 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328407 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328415 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328423 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfb673e7-59bc-41d1-9bf0-d20527c4a740-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328432 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978200e0-025d-4000-baed-4ba85bf83c60-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328440 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-config-data-default\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328448 4929 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-kolla-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328456 4929 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/978200e0-025d-4000-baed-4ba85bf83c60-operator-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328463 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfb673e7-59bc-41d1-9bf0-d20527c4a740-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.328471 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/978200e0-025d-4000-baed-4ba85bf83c60-config-data-generated\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.345463 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.345813 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.430191 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.430223 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.854364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dfb673e7-59bc-41d1-9bf0-d20527c4a740","Type":"ContainerDied","Data":"54c0548cb2876ce82bba03d6ef6e8eaf0d8bb581208aa19783a534ab65ab4c5c"}
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.854414 4929 scope.go:117] "RemoveContainer" containerID="1300e80581b8037301a49cd07f0c5f8de41330fcc719f6803e48273136aa7404"
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.854528 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.860922 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"978200e0-025d-4000-baed-4ba85bf83c60","Type":"ContainerDied","Data":"dd72fbe7680edf1cfb1f1f34ca1f15a94207241af7cce5c63433d5fe23113c0c"}
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.860993 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.876070 4929 generic.go:334] "Generic (PLEG): container finished" podID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerID="50ab6d4116dc1e4db95a4dd8529214c90a135250b4f2785bbf13989c07ed52bf" exitCode=0
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.876115 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" event={"ID":"55fd721a-9a86-4aff-98ee-133ebd5c4f41","Type":"ContainerDied","Data":"50ab6d4116dc1e4db95a4dd8529214c90a135250b4f2785bbf13989c07ed52bf"}
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.891259 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.894947 4929 scope.go:117] "RemoveContainer" containerID="dcb01c0ec91fa8b636cd159dd6d4fbe9815deb68d2051731a33d12b7eda329bb"
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.895736 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.931081 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.940494 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.981877 4929 scope.go:117] "RemoveContainer" containerID="feb794d1e3e6ffac48fda126a9f03eaf35b4f796d6bd4c0d594593490886709d"
Oct 02 11:34:34 crc kubenswrapper[4929]: I1002 11:34:34.999231 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5"
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.010323 4929 scope.go:117] "RemoveContainer" containerID="9b31c710f5e16531b1e61137b047da65ed86c42222822c83a63e2d292b03a7f8"
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.146557 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwt5j\" (UniqueName: \"kubernetes.io/projected/55fd721a-9a86-4aff-98ee-133ebd5c4f41-kube-api-access-lwt5j\") pod \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") "
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.146597 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data\") pod \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") "
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.146631 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-combined-ca-bundle\") pod \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") "
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.146679 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fd721a-9a86-4aff-98ee-133ebd5c4f41-logs\") pod \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") "
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.146745 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data-custom\") pod \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\" (UID: \"55fd721a-9a86-4aff-98ee-133ebd5c4f41\") "
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.147215 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55fd721a-9a86-4aff-98ee-133ebd5c4f41-logs" (OuterVolumeSpecName: "logs") pod "55fd721a-9a86-4aff-98ee-133ebd5c4f41" (UID: "55fd721a-9a86-4aff-98ee-133ebd5c4f41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.150789 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "55fd721a-9a86-4aff-98ee-133ebd5c4f41" (UID: "55fd721a-9a86-4aff-98ee-133ebd5c4f41"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.152745 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fd721a-9a86-4aff-98ee-133ebd5c4f41-kube-api-access-lwt5j" (OuterVolumeSpecName: "kube-api-access-lwt5j") pod "55fd721a-9a86-4aff-98ee-133ebd5c4f41" (UID: "55fd721a-9a86-4aff-98ee-133ebd5c4f41"). InnerVolumeSpecName "kube-api-access-lwt5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.164735 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55fd721a-9a86-4aff-98ee-133ebd5c4f41" (UID: "55fd721a-9a86-4aff-98ee-133ebd5c4f41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.182893 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data" (OuterVolumeSpecName: "config-data") pod "55fd721a-9a86-4aff-98ee-133ebd5c4f41" (UID: "55fd721a-9a86-4aff-98ee-133ebd5c4f41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.248247 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwt5j\" (UniqueName: \"kubernetes.io/projected/55fd721a-9a86-4aff-98ee-133ebd5c4f41-kube-api-access-lwt5j\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.248565 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.248577 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.248586 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fd721a-9a86-4aff-98ee-133ebd5c4f41-logs\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.248594 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55fd721a-9a86-4aff-98ee-133ebd5c4f41-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.890991 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5" event={"ID":"55fd721a-9a86-4aff-98ee-133ebd5c4f41","Type":"ContainerDied","Data":"75b30becb54620544c631af8adaf9f8ea7d83f419531de75d47384e21d4ffdf4"}
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.891057 4929 scope.go:117] "RemoveContainer" containerID="50ab6d4116dc1e4db95a4dd8529214c90a135250b4f2785bbf13989c07ed52bf"
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.891094 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f9b7d8ff7-88gb5"
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.916589 4929 scope.go:117] "RemoveContainer" containerID="241b3468cb9d97e9ba6f173143b44936f01809a56d73599a592eacd87d2efe4d"
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.942239 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6f9b7d8ff7-88gb5"]
Oct 02 11:34:35 crc kubenswrapper[4929]: I1002 11:34:35.949049 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6f9b7d8ff7-88gb5"]
Oct 02 11:34:36 crc kubenswrapper[4929]: I1002 11:34:36.168670 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" path="/var/lib/kubelet/pods/55fd721a-9a86-4aff-98ee-133ebd5c4f41/volumes"
Oct 02 11:34:36 crc kubenswrapper[4929]: I1002 11:34:36.169727 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978200e0-025d-4000-baed-4ba85bf83c60" path="/var/lib/kubelet/pods/978200e0-025d-4000-baed-4ba85bf83c60/volumes"
Oct 02 11:34:36 crc kubenswrapper[4929]: I1002 11:34:36.171164 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" path="/var/lib/kubelet/pods/dfb673e7-59bc-41d1-9bf0-d20527c4a740/volumes"
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.190032 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.191060 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:34:54.191038922 +0000 UTC m=+1494.741405286 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.598118 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.598491 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.599042 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.599077 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server"
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.600246 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.602013 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.604863 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:38 crc kubenswrapper[4929]: E1002 11:34:38.604921 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd"
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.597195 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.598291 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.598733 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.598774 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server"
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.599512 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.601215 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.602588 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:43 crc kubenswrapper[4929]: E1002 11:34:43.602641 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd"
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.736866 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.737197 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.737247 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.737799 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85f08424ea0549c33e8adce5bf52a0ee3804dea4bc1b5c410a9b0fdc77644661"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.737859 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://85f08424ea0549c33e8adce5bf52a0ee3804dea4bc1b5c410a9b0fdc77644661" gracePeriod=600
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.986992 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="85f08424ea0549c33e8adce5bf52a0ee3804dea4bc1b5c410a9b0fdc77644661" exitCode=0
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.987039 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"85f08424ea0549c33e8adce5bf52a0ee3804dea4bc1b5c410a9b0fdc77644661"}
Oct 02 11:34:44 crc kubenswrapper[4929]: I1002 11:34:44.987075 4929 scope.go:117] "RemoveContainer" containerID="d06bfb52896e631ee026cc068e1500959957fd07486c92bce6fd839653f6a217"
Oct 02 11:34:45 crc kubenswrapper[4929]: I1002 11:34:45.996554 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0"}
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.597658 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.598739 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.598782 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.599385 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.599417 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server"
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.600015 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.601924 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:48 crc kubenswrapper[4929]: E1002 11:34:48.602047 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd"
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.597644 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.599156 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.600116 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.600697 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.600794 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server"
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.600996 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.603769 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 02 11:34:53 crc kubenswrapper[4929]: E1002 11:34:53.603830 4929 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-fv8ff" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd"
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.090816 4929 generic.go:334] "Generic (PLEG): container finished" podID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerID="ad49c21a672c805ec312d5fb5f9c9032867c22231864156a14347d73f9b26ac2" exitCode=137
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.091033 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"ad49c21a672c805ec312d5fb5f9c9032867c22231864156a14347d73f9b26ac2"}
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.180454 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.239548 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") pod \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") "
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.239606 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") "
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.239656 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-lock\") pod \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") "
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.239686 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh64w\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-kube-api-access-nh64w\") pod \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") "
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.239754 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-cache\") pod \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\" (UID: \"4fca7cc0-4347-4fb0-99a2-5bdef9efd204\") "
Oct 02 11:34:54 crc kubenswrapper[4929]: E1002 11:34:54.240111 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 02 11:34:54 crc kubenswrapper[4929]: E1002 11:34:54.240186 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. No retries permitted until 2025-10-02 11:35:26.240149519 +0000 UTC m=+1526.790515893 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.240321 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-lock" (OuterVolumeSpecName: "lock") pod "4fca7cc0-4347-4fb0-99a2-5bdef9efd204" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.240425 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-cache" (OuterVolumeSpecName: "cache") pod "4fca7cc0-4347-4fb0-99a2-5bdef9efd204" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.244468 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-kube-api-access-nh64w" (OuterVolumeSpecName: "kube-api-access-nh64w") pod "4fca7cc0-4347-4fb0-99a2-5bdef9efd204" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204"). InnerVolumeSpecName "kube-api-access-nh64w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.244589 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4fca7cc0-4347-4fb0-99a2-5bdef9efd204" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.246848 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "4fca7cc0-4347-4fb0-99a2-5bdef9efd204" (UID: "4fca7cc0-4347-4fb0-99a2-5bdef9efd204"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.341116 4929 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.341191 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.341207 4929 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-lock\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.341220 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh64w\" (UniqueName: \"kubernetes.io/projected/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-kube-api-access-nh64w\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.341232 4929 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fca7cc0-4347-4fb0-99a2-5bdef9efd204-cache\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.359735 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.442445 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.919413 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fv8ff_0e942503-506b-4a11-aa8b-ca122be42fbb/ovs-vswitchd/0.log"
Oct 02 11:34:54 crc kubenswrapper[4929]: I1002 11:34:54.921211 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060366 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e942503-506b-4a11-aa8b-ca122be42fbb-scripts\") pod \"0e942503-506b-4a11-aa8b-ca122be42fbb\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") "
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060456 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-run\") pod \"0e942503-506b-4a11-aa8b-ca122be42fbb\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") "
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060530 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-lib\") pod \"0e942503-506b-4a11-aa8b-ca122be42fbb\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") "
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060560 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czpkw\" (UniqueName: \"kubernetes.io/projected/0e942503-506b-4a11-aa8b-ca122be42fbb-kube-api-access-czpkw\") pod \"0e942503-506b-4a11-aa8b-ca122be42fbb\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") "
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060577 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-run" (OuterVolumeSpecName: "var-run") pod "0e942503-506b-4a11-aa8b-ca122be42fbb" (UID: "0e942503-506b-4a11-aa8b-ca122be42fbb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060615 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-lib" (OuterVolumeSpecName: "var-lib") pod "0e942503-506b-4a11-aa8b-ca122be42fbb" (UID: "0e942503-506b-4a11-aa8b-ca122be42fbb"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060640 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-etc-ovs\") pod \"0e942503-506b-4a11-aa8b-ca122be42fbb\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") "
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060687 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-log\") pod \"0e942503-506b-4a11-aa8b-ca122be42fbb\" (UID: \"0e942503-506b-4a11-aa8b-ca122be42fbb\") "
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060710 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "0e942503-506b-4a11-aa8b-ca122be42fbb" (UID: "0e942503-506b-4a11-aa8b-ca122be42fbb"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.060804 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-log" (OuterVolumeSpecName: "var-log") pod "0e942503-506b-4a11-aa8b-ca122be42fbb" (UID: "0e942503-506b-4a11-aa8b-ca122be42fbb"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.061038 4929 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-lib\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.061055 4929 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-etc-ovs\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.061066 4929 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-log\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.061076 4929 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0e942503-506b-4a11-aa8b-ca122be42fbb-var-run\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.062148 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e942503-506b-4a11-aa8b-ca122be42fbb-scripts" (OuterVolumeSpecName: "scripts") pod "0e942503-506b-4a11-aa8b-ca122be42fbb" (UID: "0e942503-506b-4a11-aa8b-ca122be42fbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.064427 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e942503-506b-4a11-aa8b-ca122be42fbb-kube-api-access-czpkw" (OuterVolumeSpecName: "kube-api-access-czpkw") pod "0e942503-506b-4a11-aa8b-ca122be42fbb" (UID: "0e942503-506b-4a11-aa8b-ca122be42fbb"). InnerVolumeSpecName "kube-api-access-czpkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.101413 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fv8ff_0e942503-506b-4a11-aa8b-ca122be42fbb/ovs-vswitchd/0.log"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.102030 4929 generic.go:334] "Generic (PLEG): container finished" podID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b" exitCode=137
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.102081 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fv8ff"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.102102 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fv8ff" event={"ID":"0e942503-506b-4a11-aa8b-ca122be42fbb","Type":"ContainerDied","Data":"23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b"}
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.102156 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fv8ff" event={"ID":"0e942503-506b-4a11-aa8b-ca122be42fbb","Type":"ContainerDied","Data":"ffd7c7ace908e9ba79b8e86f1630c30d209c74656698a0c176f90bf3cabe102d"}
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.102174 4929 scope.go:117] "RemoveContainer" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.110433 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4fca7cc0-4347-4fb0-99a2-5bdef9efd204","Type":"ContainerDied","Data":"ee15389ee8c608078322513cb85144936d51506916342c258b7db63e62bac1dd"}
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.110543 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.137404 4929 scope.go:117] "RemoveContainer" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.140078 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-fv8ff"]
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.144466 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-fv8ff"]
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.159172 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.162382 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e942503-506b-4a11-aa8b-ca122be42fbb-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.162407 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czpkw\" (UniqueName: \"kubernetes.io/projected/0e942503-506b-4a11-aa8b-ca122be42fbb-kube-api-access-czpkw\") on node \"crc\" DevicePath \"\""
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.165554 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"]
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.167570 4929 scope.go:117] "RemoveContainer" containerID="5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.194421 4929 scope.go:117] "RemoveContainer" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b"
Oct 02 11:34:55 crc kubenswrapper[4929]: E1002 11:34:55.194912 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b\": container with ID starting with 23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b not found: ID does not exist" containerID="23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.194941 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b"} err="failed to get container status \"23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b\": rpc error: code = NotFound desc = could not find container \"23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b\": container with ID starting with 23ed9b40113ebc30de47157a5fe4b3aff0e291ecc18bd81715215b3bf6c8532b not found: ID does not exist"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.194989 4929 scope.go:117] "RemoveContainer" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0"
Oct 02 11:34:55 crc kubenswrapper[4929]: E1002 11:34:55.195221 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0\": container with ID starting with 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 not found: ID does not exist" containerID="5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.195252 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0"} err="failed to get container status \"5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0\": rpc error: code = NotFound desc = could not find container \"5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0\": container with ID starting with 5ade165aa2dea80f7447014761662dead4edf4ec0327529da85211be6037c0e0 not found: ID does not exist"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.195269 4929 scope.go:117] "RemoveContainer" containerID="5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a"
Oct 02 11:34:55 crc kubenswrapper[4929]: E1002 11:34:55.195507 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a\": container with ID starting with 5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a not found: ID does not exist" containerID="5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.195532 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a"} err="failed to get container status \"5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a\": rpc error: code = NotFound desc = could not find container \"5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a\": container with ID starting with 5bdd0b49b8ddde321ee5a12cf4043a1ce81b4592f63c00e023b4c0f13be5e41a not found: ID does not exist"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.195547 4929 scope.go:117] "RemoveContainer" containerID="ad49c21a672c805ec312d5fb5f9c9032867c22231864156a14347d73f9b26ac2"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.215909 4929 scope.go:117] "RemoveContainer" containerID="d463cb431a68d8c8a6bc8838afb25b1c342f4bd84ce2023c1ef8358a6d79a0eb"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.237930 4929 scope.go:117] "RemoveContainer" containerID="1d45e6424e955430dffa5579cf4f5d18c47a7931b6e630ff334c44c39257c19c"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.255005 4929 scope.go:117] "RemoveContainer" containerID="e4b14c7b773820d673e84708764e3207f484e07e6581b05635251acbe436a01b"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.274661 4929 scope.go:117] "RemoveContainer" containerID="8e7f1a184638b273c379d892f5706e40b2b0e6e8a03f5d40e8cf8e31bb64e072"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.298555 4929 scope.go:117] "RemoveContainer" containerID="b60072b033a9f359be40981e533a243a7b09e82e55c795215b5c3ac05b529145"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.326467 4929 scope.go:117] "RemoveContainer" containerID="4c94d3591c6c3e15abde4c9e9bda1a1d6451806b1b6b0c671dfee4007ae1a8e3"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.350505 4929 scope.go:117] "RemoveContainer" containerID="50576db641077b7a1e9dbf23a9fc5b7cda23206b43329a6994ae96a7b01bca1b"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.369493 4929 scope.go:117] "RemoveContainer" containerID="7fc56f6f53a6276996cea4cb299fd663fe0652dc22f0c71496b40d33cbd4a999"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.390299 4929 scope.go:117] "RemoveContainer" containerID="d8d4bec396dfc299189ef1b9a62ea5ec2484a5fe0492556f6ab9d91d861c28eb"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.408992 4929 scope.go:117] "RemoveContainer" containerID="b84dbe7e64f60d151dd9bf83d9d60d85c9c02ab74df4e15e5302c38e6a6cc41c"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.423746 4929 scope.go:117] "RemoveContainer" containerID="8035c6c8bfb09505e74f61334bb079defd5376ea17a1624b860ad93bb160b4a7"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.439739 4929 scope.go:117] "RemoveContainer" containerID="cefdca76aa5689375d6555f161f627d6003d8fac34c8df5144f6cacb5ac6866c"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.484014 4929 scope.go:117] "RemoveContainer" containerID="908ca2fd69c5108943cc878199a4b51016780d034eeff8322db65f1600694e85"
Oct 02 11:34:55 crc kubenswrapper[4929]: I1002 11:34:55.502878 4929 scope.go:117] "RemoveContainer" containerID="c422966ff4f7462ce3b91b168f5ac09e2c599380e12f670a2a65404aef3dd588"
Oct 02 11:34:56 crc kubenswrapper[4929]: I1002 11:34:56.165624 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" path="/var/lib/kubelet/pods/0e942503-506b-4a11-aa8b-ca122be42fbb/volumes"
Oct 02 11:34:56 crc kubenswrapper[4929]: I1002 11:34:56.166488 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" path="/var/lib/kubelet/pods/4fca7cc0-4347-4fb0-99a2-5bdef9efd204/volumes"
Oct 02 11:35:02 crc kubenswrapper[4929]: I1002 11:35:02.756828 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.203:3000/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.935710 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpms"]
Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936500 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c5fe9e-033d-4c3b-a71f-e2c215add4c5" containerName="memcached"
Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936517
4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c5fe9e-033d-4c3b-a71f-e2c215add4c5" containerName="memcached" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936534 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936542 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-server" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936559 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="probe" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936568 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="probe" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936577 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d85c4b-9da3-40d5-a5c3-0aeac38eecee" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936584 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d85c4b-9da3-40d5-a5c3-0aeac38eecee" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936619 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9c788d-5f22-443d-aa60-2f9e88dce9fd" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936628 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9c788d-5f22-443d-aa60-2f9e88dce9fd" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936643 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936651 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936667 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="ovsdbserver-nb" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936675 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="ovsdbserver-nb" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936688 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936697 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936709 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-updater" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936717 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-updater" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936727 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 
11:35:17.936735 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-server" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936744 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-notification-agent" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936753 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-notification-agent" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936763 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936770 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936782 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936789 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936801 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerName="rabbitmq" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936808 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerName="rabbitmq" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936819 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936826 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936838 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-updater" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936846 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-updater" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936854 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b9fa36-f990-4cce-9544-23828715aa54" containerName="nova-cell1-conductor-conductor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936862 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b9fa36-f990-4cce-9544-23828715aa54" containerName="nova-cell1-conductor-conductor" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936872 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d60065-8bbd-4182-be31-c0f851790792" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936879 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d60065-8bbd-4182-be31-c0f851790792" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936896 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="978200e0-025d-4000-baed-4ba85bf83c60" containerName="galera" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936903 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="978200e0-025d-4000-baed-4ba85bf83c60" containerName="galera" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936915 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936922 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936931 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="rsync" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936939 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="rsync" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936951 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936975 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.936984 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-reaper" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.936990 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-reaper" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937001 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="ovsdbserver-sb" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937007 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="ovsdbserver-sb" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937018 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937025 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937035 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937041 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-server" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937052 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86f887d-db93-49c4-85ed-add5f01b25f7" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937057 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86f887d-db93-49c4-85ed-add5f01b25f7" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937065 4929 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="978200e0-025d-4000-baed-4ba85bf83c60" containerName="mysql-bootstrap" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937071 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="978200e0-025d-4000-baed-4ba85bf83c60" containerName="mysql-bootstrap" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937081 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerName="setup-container" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937087 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerName="setup-container" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937095 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937100 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937111 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692c9c38-07d7-455f-8d9c-984904aef051" containerName="init" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937116 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="692c9c38-07d7-455f-8d9c-984904aef051" containerName="init" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937126 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937132 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937139 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18e7ab5-8994-4a34-98d2-0e65bbfc4068" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937145 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18e7ab5-8994-4a34-98d2-0e65bbfc4068" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937151 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937157 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937166 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="692c9c38-07d7-455f-8d9c-984904aef051" containerName="dnsmasq-dns" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937172 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="692c9c38-07d7-455f-8d9c-984904aef051" containerName="dnsmasq-dns" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937181 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937187 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937198 4929 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937203 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937212 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937217 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937224 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="sg-core" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937232 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="sg-core" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937239 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="proxy-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937246 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="proxy-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937252 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-metadata" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937258 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-metadata" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937267 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937274 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-api" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937282 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f77cee-69d5-4e5c-8707-a5be1914e351" containerName="kube-state-metrics" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937290 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f77cee-69d5-4e5c-8707-a5be1914e351" containerName="kube-state-metrics" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937305 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ec6412-e313-4ed7-ae20-d531571b5be6" containerName="nova-cell0-conductor-conductor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937314 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ec6412-e313-4ed7-ae20-d531571b5be6" containerName="nova-cell0-conductor-conductor" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937325 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937331 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 
11:35:17.937339 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937344 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937353 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server-init" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937359 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server-init" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937367 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937374 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937384 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937391 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937405 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937412 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937421 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937428 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937436 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937442 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937454 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937460 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-server" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937471 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b07c8ee2-5443-410c-b2ab-b48699694626" containerName="mysql-bootstrap" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937478 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b07c8ee2-5443-410c-b2ab-b48699694626" 
containerName="mysql-bootstrap" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937486 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="swift-recon-cron" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937494 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="swift-recon-cron" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937503 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937511 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937520 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937525 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-api" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937533 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937540 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937547 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937553 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937561 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937566 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-log" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937574 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="ovn-northd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937579 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="ovn-northd" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937588 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-expirer" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937593 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-expirer" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937601 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937607 4929 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937615 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="cinder-scheduler" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937620 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="cinder-scheduler" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937629 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937634 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937643 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-central-agent" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937649 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-central-agent" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937657 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89c2414-cee5-46e9-9284-cd96fb472fd7" containerName="keystone-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937664 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89c2414-cee5-46e9-9284-cd96fb472fd7" containerName="keystone-api" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937676 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937684 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-api" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937693 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f95478-d16b-4ffb-9389-f68851cce4a6" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937699 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f95478-d16b-4ffb-9389-f68851cce4a6" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937711 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b07c8ee2-5443-410c-b2ab-b48699694626" containerName="galera" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937718 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b07c8ee2-5443-410c-b2ab-b48699694626" containerName="galera" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937734 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba6072c-759c-4261-8107-8243d262003d" containerName="nova-scheduler-scheduler" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937745 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba6072c-759c-4261-8107-8243d262003d" containerName="nova-scheduler-scheduler" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937758 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d52b938-d877-46ba-b19c-7e6331422d01" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937766 4929 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7d52b938-d877-46ba-b19c-7e6331422d01" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937773 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937780 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937787 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937793 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937804 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937809 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd" Oct 02 11:35:17 crc kubenswrapper[4929]: E1002 11:35:17.937820 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7503492-8d47-4852-aca4-0bb661665127" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937828 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7503492-8d47-4852-aca4-0bb661665127" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.937998 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938007 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="ovsdbserver-nb" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938017 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-central-agent" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938027 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938034 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86f887d-db93-49c4-85ed-add5f01b25f7" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938045 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9c788d-5f22-443d-aa60-2f9e88dce9fd" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938052 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" containerName="cinder-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938061 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="swift-recon-cron" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938069 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="39949247-a1b3-41bc-a94a-4c59049576cd" 
containerName="cinder-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938076 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-updater" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938082 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c89c2414-cee5-46e9-9284-cd96fb472fd7" containerName="keystone-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938091 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d60065-8bbd-4182-be31-c0f851790792" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938100 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938107 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="ovn-northd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938114 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-expirer" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938124 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovs-vswitchd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938133 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-metadata" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938140 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938147 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938155 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938164 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938172 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938179 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="rsync" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938187 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b67fd7d-2814-4efd-ad06-ee8283104d49" containerName="placement-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938195 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba6072c-759c-4261-8107-8243d262003d" containerName="nova-scheduler-scheduler" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938202 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="cinder-scheduler" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 
11:35:17.938210 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938220 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18e7ab5-8994-4a34-98d2-0e65bbfc4068" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938228 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ebda2a-aee6-4eed-8333-5e96219fdcb3" containerName="glance-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938235 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="692c9c38-07d7-455f-8d9c-984904aef051" containerName="dnsmasq-dns" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938241 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb673e7-59bc-41d1-9bf0-d20527c4a740" containerName="rabbitmq" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938250 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938259 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938265 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="sg-core" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938271 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7503492-8d47-4852-aca4-0bb661665127" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938279 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-updater" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938287 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace60114-0dd0-4f94-aad6-b1c2ace2c9d2" containerName="probe" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938297 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b9fa36-f990-4cce-9544-23828715aa54" containerName="nova-cell1-conductor-conductor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938303 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fd721a-9a86-4aff-98ee-133ebd5c4f41" containerName="barbican-worker-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938311 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="b07c8ee2-5443-410c-b2ab-b48699694626" containerName="galera" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938317 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938326 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-reaper" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938333 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938339 4929 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d4a10ac0-e47f-47cf-9779-d60c30b14755" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938347 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c5fe9e-033d-4c3b-a71f-e2c215add4c5" containerName="memcached" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938352 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e50682-8502-4570-916a-a3b90a5218e4" containerName="openstack-network-exporter" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938362 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="account-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938367 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f95478-d16b-4ffb-9389-f68851cce4a6" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938377 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e942503-506b-4a11-aa8b-ca122be42fbb" containerName="ovsdb-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938383 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5faf6a4-6d67-4104-817f-422bdde6bf30" containerName="nova-metadata-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938391 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0a2f2d-39b4-4f6f-acf7-e7fc0ddfd88f" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938399 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e033b9-12bd-4de4-ba18-807beaca68db" containerName="neutron-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938406 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fafc589-0041-44b2-a66b-93f4676c3cb1" containerName="nova-api-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938413 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api-log" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938419 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="752197b6-8008-4699-895b-4cbf3d475e96" containerName="ovn-controller" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938427 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="842a33bb-8f7e-468a-96de-cf4d2b4a1d3f" containerName="barbican-keystone-listener" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938435 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938442 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="978200e0-025d-4000-baed-4ba85bf83c60" containerName="galera" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938449 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="proxy-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938455 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c56c4a-763f-4ce6-8b1f-d862662b16ec" containerName="proxy-httpd" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938463 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15b3dd7-69b2-480e-b61d-bba396447b88" containerName="glance-httpd" Oct 02 
11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938472 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-auditor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938481 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f090f547-93e3-4b7f-a3c8-8d97c8b2fca8" containerName="ceilometer-notification-agent" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938489 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f77cee-69d5-4e5c-8707-a5be1914e351" containerName="kube-state-metrics" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938495 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d52b938-d877-46ba-b19c-7e6331422d01" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938502 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d85c4b-9da3-40d5-a5c3-0aeac38eecee" containerName="mariadb-account-delete" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938510 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="object-replicator" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938518 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fca7cc0-4347-4fb0-99a2-5bdef9efd204" containerName="container-server" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938525 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0101ab-4fa3-4475-a685-fdd9ebb0ef68" containerName="barbican-api" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938533 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ec6412-e313-4ed7-ae20-d531571b5be6" containerName="nova-cell0-conductor-conductor" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.938541 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8946c48-0a50-449c-b64a-e8e4ae2f84ba" containerName="ovsdbserver-sb" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.939716 4929 util.go:30] "No sandbox for pod can be found. 
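
Note on the long cpu_manager/state_mem/memory_manager run above: this is routine admission-time cleanup. When the kubelet admits a new pod (here openshift-marketplace/redhat-marketplace-tzpms), the CPU and memory managers purge checkpointed per-container state belonging to pods that no longer exist; the E-level "RemoveStaleState" lines are emitted at error severity but are benign. A minimal sketch of the idea, assuming a simple map keyed by pod UID and container name (not the kubelet's actual data structures):

    package main

    import "fmt"

    type key struct{ podUID, containerName string }

    // removeStaleState drops assignments whose pod is no longer active.
    // Deleting from a map while ranging over it is safe in Go.
    func removeStaleState(assignments map[key][]int, activePods map[string]bool) {
            for k := range assignments {
                    if !activePods[k.podUID] {
                            fmt.Printf("RemoveStaleState: removing container %q of pod %s\n", k.containerName, k.podUID)
                            delete(assignments, k)
                    }
            }
    }

    func main() {
            assignments := map[key][]int{
                    {"56c5fe9e-033d-4c3b-a71f-e2c215add4c5", "memcached"}: {2, 3},
            }
            removeStaleState(assignments, map[string]bool{}) // no active pods left: entry purged
    }
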
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:17 crc kubenswrapper[4929]: I1002 11:35:17.947905 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpms"] Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.110272 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84fs7\" (UniqueName: \"kubernetes.io/projected/75547475-5d24-47c9-8a01-354946175616-kube-api-access-84fs7\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.110396 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-catalog-content\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.110467 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-utilities\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.212218 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-catalog-content\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.212331 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-utilities\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.212406 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84fs7\" (UniqueName: \"kubernetes.io/projected/75547475-5d24-47c9-8a01-354946175616-kube-api-access-84fs7\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.212882 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-catalog-content\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.213139 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-utilities\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.235268 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-84fs7\" (UniqueName: \"kubernetes.io/projected/75547475-5d24-47c9-8a01-354946175616-kube-api-access-84fs7\") pod \"redhat-marketplace-tzpms\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.268843 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:18 crc kubenswrapper[4929]: I1002 11:35:18.709464 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpms"] Oct 02 11:35:19 crc kubenswrapper[4929]: I1002 11:35:19.338636 4929 generic.go:334] "Generic (PLEG): container finished" podID="75547475-5d24-47c9-8a01-354946175616" containerID="0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9" exitCode=0 Oct 02 11:35:19 crc kubenswrapper[4929]: I1002 11:35:19.338703 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpms" event={"ID":"75547475-5d24-47c9-8a01-354946175616","Type":"ContainerDied","Data":"0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9"} Oct 02 11:35:19 crc kubenswrapper[4929]: I1002 11:35:19.338739 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpms" event={"ID":"75547475-5d24-47c9-8a01-354946175616","Type":"ContainerStarted","Data":"0c9ffb872309e6fd53d89ceea9b0e719fe64dd29c2990aa46ee3880159495fe6"} Oct 02 11:35:20 crc kubenswrapper[4929]: I1002 11:35:20.347565 4929 generic.go:334] "Generic (PLEG): container finished" podID="75547475-5d24-47c9-8a01-354946175616" containerID="09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff" exitCode=0 Oct 02 11:35:20 crc kubenswrapper[4929]: I1002 11:35:20.347626 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpms" event={"ID":"75547475-5d24-47c9-8a01-354946175616","Type":"ContainerDied","Data":"09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff"} Oct 02 11:35:21 crc kubenswrapper[4929]: I1002 11:35:21.365883 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpms" event={"ID":"75547475-5d24-47c9-8a01-354946175616","Type":"ContainerStarted","Data":"e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa"} Oct 02 11:35:21 crc kubenswrapper[4929]: I1002 11:35:21.388035 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzpms" podStartSLOduration=2.927980043 podStartE2EDuration="4.388019154s" podCreationTimestamp="2025-10-02 11:35:17 +0000 UTC" firstStartedPulling="2025-10-02 11:35:19.341064783 +0000 UTC m=+1519.891431157" lastFinishedPulling="2025-10-02 11:35:20.801103904 +0000 UTC m=+1521.351470268" observedRunningTime="2025-10-02 11:35:21.385493351 +0000 UTC m=+1521.935859765" watchObservedRunningTime="2025-10-02 11:35:21.388019154 +0000 UTC m=+1521.938385518" Oct 02 11:35:26 crc kubenswrapper[4929]: E1002 11:35:26.241527 4929 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:35:26 crc kubenswrapper[4929]: E1002 11:35:26.241873 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data podName:be704e8e-9b46-4dfb-9363-278e61720eaa nodeName:}" failed. 
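
Two asides before the continuation of the entry above. First, the pod_startup_latency_tracker figures are internally consistent: podStartE2EDuration (4.388019154s) minus the image-pull window (m=+1521.351470268 - m=+1519.891431157 = 1.460039111s) gives podStartSLOduration (2.927980043s), i.e. the SLO figure excludes time spent pulling images. Second, the "No retries permitted until ..." continuation that follows schedules the next mount attempt 1m4s out, which matches a doubling backoff seeded at 500ms (500ms * 2^7 = 64s). A sketch of that retry curve; the 500ms seed and the 2m2s cap are assumptions based on kubelet defaults, not read from this log:

    package main

    import (
            "fmt"
            "time"
    )

    // nextBackoff doubles the previous delay, starting at an assumed 500ms and
    // saturating at max, mirroring the volume manager's retry scheduling.
    func nextBackoff(prev, max time.Duration) time.Duration {
            if prev == 0 {
                    return 500 * time.Millisecond
            }
            next := prev * 2
            if next > max {
                    next = max
            }
            return next
    }

    func main() {
            var d time.Duration
            for i := 0; i < 9; i++ {
                    d = nextBackoff(d, 2*time.Minute+2*time.Second)
                    fmt.Println(d) // 500ms, 1s, 2s, ..., 32s, 1m4s, 2m2s
            }
    }
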
No retries permitted until 2025-10-02 11:36:30.241854687 +0000 UTC m=+1590.792221061 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data") pod "rabbitmq-server-0" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa") : configmap "rabbitmq-config-data" not found Oct 02 11:35:27 crc kubenswrapper[4929]: E1002 11:35:27.635261 4929 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 02 11:35:27 crc kubenswrapper[4929]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below. Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Most common reasons for this are: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Oct 02 11:35:27 crc kubenswrapper[4929]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server) Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is not running Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: In addition to the diagnostics info below: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Oct 02 11:35:27 crc kubenswrapper[4929]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack Oct 02 11:35:27 crc kubenswrapper[4929]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: DIAGNOSTICS Oct 02 11:35:27 crc kubenswrapper[4929]: =========== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'] Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack: Oct 02 11:35:27 crc kubenswrapper[4929]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain) Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Current node details: Oct 02 11:35:27 crc kubenswrapper[4929]: * node name: 'rabbitmqcli-155-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack' Oct 02 11:35:27 crc kubenswrapper[4929]: * effective user's home directory: /var/lib/rabbitmq Oct 02 11:35:27 crc kubenswrapper[4929]: * Erlang cookie hash: 03OAAMzSf+tUpcNIbJlOTg== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below. 
Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Most common reasons for this are: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Oct 02 11:35:27 crc kubenswrapper[4929]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server) Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is not running Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: In addition to the diagnostics info below: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Oct 02 11:35:27 crc kubenswrapper[4929]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack Oct 02 11:35:27 crc kubenswrapper[4929]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: DIAGNOSTICS Oct 02 11:35:27 crc kubenswrapper[4929]: =========== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'] Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack: Oct 02 11:35:27 crc kubenswrapper[4929]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain) Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Current node details: Oct 02 11:35:27 crc kubenswrapper[4929]: * node name: 'rabbitmqcli-424-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack' Oct 02 11:35:27 crc kubenswrapper[4929]: * effective user's home directory: /var/lib/rabbitmq Oct 02 11:35:27 crc kubenswrapper[4929]: * Erlang cookie hash: 03OAAMzSf+tUpcNIbJlOTg== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: > execCommand=["/bin/bash","-c","if [ ! -z \"$(cat /etc/pod-info/skipPreStopChecks)\" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 \u0026\u0026 rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true \u0026\u0026 rabbitmq-upgrade drain -t 604800"] containerName="rabbitmq" pod="openstack/rabbitmq-server-0" message=< Oct 02 11:35:27 crc kubenswrapper[4929]: Will put node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack into maintenance mode. The node will no longer serve any client traffic! Oct 02 11:35:27 crc kubenswrapper[4929]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below. Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Most common reasons for this are: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Oct 02 11:35:27 crc kubenswrapper[4929]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is not running Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: In addition to the diagnostics info below: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Oct 02 11:35:27 crc kubenswrapper[4929]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack Oct 02 11:35:27 crc kubenswrapper[4929]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: DIAGNOSTICS Oct 02 11:35:27 crc kubenswrapper[4929]: =========== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'] Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack: Oct 02 11:35:27 crc kubenswrapper[4929]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain) Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Current node details: Oct 02 11:35:27 crc kubenswrapper[4929]: * node name: 'rabbitmqcli-155-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack' Oct 02 11:35:27 crc kubenswrapper[4929]: * effective user's home directory: /var/lib/rabbitmq Oct 02 11:35:27 crc kubenswrapper[4929]: * Erlang cookie hash: 03OAAMzSf+tUpcNIbJlOTg== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below. Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Most common reasons for this are: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Oct 02 11:35:27 crc kubenswrapper[4929]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is not running Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: In addition to the diagnostics info below: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Oct 02 11:35:27 crc kubenswrapper[4929]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack Oct 02 11:35:27 crc kubenswrapper[4929]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: DIAGNOSTICS Oct 02 11:35:27 crc kubenswrapper[4929]: =========== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'] Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack: Oct 02 11:35:27 crc kubenswrapper[4929]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain) Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Current node details: Oct 02 11:35:27 crc kubenswrapper[4929]: * node name: 'rabbitmqcli-424-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack' Oct 02 11:35:27 crc kubenswrapper[4929]: * effective user's home directory: /var/lib/rabbitmq Oct 02 11:35:27 crc kubenswrapper[4929]: * Erlang cookie hash: 03OAAMzSf+tUpcNIbJlOTg== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: > Oct 02 11:35:27 crc kubenswrapper[4929]: E1002 11:35:27.636390 4929 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 02 11:35:27 crc kubenswrapper[4929]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below. Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Most common reasons for this are: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Oct 02 11:35:27 crc kubenswrapper[4929]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is not running Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: In addition to the diagnostics info below: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Oct 02 11:35:27 crc kubenswrapper[4929]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack Oct 02 11:35:27 crc kubenswrapper[4929]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: DIAGNOSTICS Oct 02 11:35:27 crc kubenswrapper[4929]: =========== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'] Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack: Oct 02 11:35:27 crc kubenswrapper[4929]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain) Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Current node details: Oct 02 11:35:27 crc kubenswrapper[4929]: * node name: 'rabbitmqcli-155-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack' Oct 02 11:35:27 crc kubenswrapper[4929]: * effective user's home directory: /var/lib/rabbitmq Oct 02 11:35:27 crc kubenswrapper[4929]: * Erlang cookie hash: 03OAAMzSf+tUpcNIbJlOTg== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below. Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Most common reasons for this are: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Oct 02 11:35:27 crc kubenswrapper[4929]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Oct 02 11:35:27 crc kubenswrapper[4929]: * Target node is not running Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: In addition to the diagnostics info below: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Oct 02 11:35:27 crc kubenswrapper[4929]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack Oct 02 11:35:27 crc kubenswrapper[4929]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: DIAGNOSTICS Oct 02 11:35:27 crc kubenswrapper[4929]: =========== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'] Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack: Oct 02 11:35:27 crc kubenswrapper[4929]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain) Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: Current node details: Oct 02 11:35:27 crc kubenswrapper[4929]: * node name: 'rabbitmqcli-424-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack' Oct 02 11:35:27 crc kubenswrapper[4929]: * effective user's home directory: /var/lib/rabbitmq Oct 02 11:35:27 crc kubenswrapper[4929]: * Erlang cookie hash: 03OAAMzSf+tUpcNIbJlOTg== Oct 02 11:35:27 crc kubenswrapper[4929]: Oct 02 11:35:27 crc kubenswrapper[4929]: > pod="openstack/rabbitmq-server-0" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="rabbitmq" containerID="cri-o://53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0" Oct 02 11:35:27 crc kubenswrapper[4929]: I1002 11:35:27.636485 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="rabbitmq" containerID="cri-o://53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0" gracePeriod=604738 Oct 02 11:35:28 crc kubenswrapper[4929]: I1002 11:35:28.269929 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:28 crc kubenswrapper[4929]: I1002 11:35:28.270166 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:28 crc kubenswrapper[4929]: I1002 11:35:28.330389 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:28 crc kubenswrapper[4929]: I1002 11:35:28.483540 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:28 crc kubenswrapper[4929]: I1002 11:35:28.567850 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpms"] Oct 02 11:35:28 crc kubenswrapper[4929]: I1002 11:35:28.768304 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 02 11:35:30 crc kubenswrapper[4929]: I1002 11:35:30.444164 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tzpms" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="registry-server" containerID="cri-o://e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa" gracePeriod=2 Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.363197 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.453006 4929 generic.go:334] "Generic (PLEG): container finished" podID="75547475-5d24-47c9-8a01-354946175616" containerID="e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa" exitCode=0 Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.453048 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpms" event={"ID":"75547475-5d24-47c9-8a01-354946175616","Type":"ContainerDied","Data":"e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa"} Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.453075 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpms" event={"ID":"75547475-5d24-47c9-8a01-354946175616","Type":"ContainerDied","Data":"0c9ffb872309e6fd53d89ceea9b0e719fe64dd29c2990aa46ee3880159495fe6"} Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.453090 4929 scope.go:117] "RemoveContainer" containerID="e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.453097 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpms" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.514201 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84fs7\" (UniqueName: \"kubernetes.io/projected/75547475-5d24-47c9-8a01-354946175616-kube-api-access-84fs7\") pod \"75547475-5d24-47c9-8a01-354946175616\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.514277 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-catalog-content\") pod \"75547475-5d24-47c9-8a01-354946175616\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.514419 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-utilities\") pod \"75547475-5d24-47c9-8a01-354946175616\" (UID: \"75547475-5d24-47c9-8a01-354946175616\") " Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.515512 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-utilities" (OuterVolumeSpecName: "utilities") pod "75547475-5d24-47c9-8a01-354946175616" (UID: "75547475-5d24-47c9-8a01-354946175616"). InnerVolumeSpecName "utilities". 
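
Two distinct failure modes appear above. The rabbitmq-config-data mount failure is retried with backoff: the reported durationBeforeRetry of 1m4s is 64s, which would be consistent with a 500ms initial delay doubled per attempt, though the exact schedule is kubelet-internal. The PreStop hook failure is a name resolution problem: rabbitmq-upgrade could not reach epmd on rabbitmq-server-0.rabbitmq-nodes.openstack because the lookup returned nxdomain, which is expected once the pod's headless-service DNS record is gone. Note also gracePeriod=604738, 62s short of the 604800s timeout the hook itself used, consistent with time already consumed before the kill. A small Go sketch of the same resolution check; the hostname is copied from the diagnostics and assumed to be unresolvable from wherever this runs:

    package main

    import (
        "errors"
        "fmt"
        "net"
    )

    func main() {
        // Hostname copied from the diagnostics above.
        host := "rabbitmq-server-0.rabbitmq-nodes.openstack"
        addrs, err := net.LookupHost(host)
        var dnsErr *net.DNSError
        switch {
        case errors.As(err, &dnsErr) && dnsErr.IsNotFound:
            fmt.Printf("%s: nxdomain (non-existing domain)\n", host)
        case err != nil:
            fmt.Println("lookup failed:", err)
        default:
            fmt.Println("resolved:", addrs)
        }
    }
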
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.520124 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75547475-5d24-47c9-8a01-354946175616-kube-api-access-84fs7" (OuterVolumeSpecName: "kube-api-access-84fs7") pod "75547475-5d24-47c9-8a01-354946175616" (UID: "75547475-5d24-47c9-8a01-354946175616"). InnerVolumeSpecName "kube-api-access-84fs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.616016 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.616050 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84fs7\" (UniqueName: \"kubernetes.io/projected/75547475-5d24-47c9-8a01-354946175616-kube-api-access-84fs7\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.632299 4929 scope.go:117] "RemoveContainer" containerID="09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.649633 4929 scope.go:117] "RemoveContainer" containerID="0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.685328 4929 scope.go:117] "RemoveContainer" containerID="e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa" Oct 02 11:35:31 crc kubenswrapper[4929]: E1002 11:35:31.685899 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa\": container with ID starting with e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa not found: ID does not exist" containerID="e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.685952 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa"} err="failed to get container status \"e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa\": rpc error: code = NotFound desc = could not find container \"e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa\": container with ID starting with e74740e1a667255795d0c7126a36c6108c375bd9fbf993d9ebc7026b0ee1d1fa not found: ID does not exist" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.686354 4929 scope.go:117] "RemoveContainer" containerID="09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff" Oct 02 11:35:31 crc kubenswrapper[4929]: E1002 11:35:31.686814 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff\": container with ID starting with 09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff not found: ID does not exist" containerID="09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.686852 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff"} err="failed to get 
container status \"09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff\": rpc error: code = NotFound desc = could not find container \"09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff\": container with ID starting with 09217c4f1714f83c689d1d07f32fec6811b27308bce26a660daa276724918eff not found: ID does not exist" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.686872 4929 scope.go:117] "RemoveContainer" containerID="0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9" Oct 02 11:35:31 crc kubenswrapper[4929]: E1002 11:35:31.687241 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9\": container with ID starting with 0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9 not found: ID does not exist" containerID="0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9" Oct 02 11:35:31 crc kubenswrapper[4929]: I1002 11:35:31.687301 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9"} err="failed to get container status \"0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9\": rpc error: code = NotFound desc = could not find container \"0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9\": container with ID starting with 0ae77f60b43622d00b2729b646ae7f5375204c4aa651c3f7eb6998bcbed737f9 not found: ID does not exist" Oct 02 11:35:32 crc kubenswrapper[4929]: I1002 11:35:32.373124 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75547475-5d24-47c9-8a01-354946175616" (UID: "75547475-5d24-47c9-8a01-354946175616"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:35:32 crc kubenswrapper[4929]: I1002 11:35:32.427140 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75547475-5d24-47c9-8a01-354946175616-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:32 crc kubenswrapper[4929]: I1002 11:35:32.692092 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpms"] Oct 02 11:35:32 crc kubenswrapper[4929]: I1002 11:35:32.698995 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpms"] Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.166351 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75547475-5d24-47c9-8a01-354946175616" path="/var/lib/kubelet/pods/75547475-5d24-47c9-8a01-354946175616/volumes" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.171134 4929 util.go:48] "No ready sandbox for pod can be found. 
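
The ContainerStatus / DeleteContainer NotFound errors above are benign: the kubelet queries the runtime about containers it has just removed, and CRI answers with gRPC NotFound. Treating NotFound as success is what makes removal idempotent. A minimal sketch of that pattern using the standard gRPC status helpers; this is an illustration, not the kubelet's actual code path:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer maps a NotFound from the runtime to success:
    // the container is already gone, so the goal state is reached.
    func removeContainer(remove func(id string) error, id string) error {
        if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    func main() {
        gone := func(id string) error {
            return status.Errorf(codes.NotFound, "could not find container %q", id)
        }
        fmt.Println(removeContainer(gone, "0ae77f60b436")) // <nil>
    }
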
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.355322 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-plugins\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.355385 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be704e8e-9b46-4dfb-9363-278e61720eaa-erlang-cookie-secret\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.355440 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.355480 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-plugins-conf\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.355524 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-server-conf\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.355568 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-confd\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356280 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-tls\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356204 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356309 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-erlang-cookie\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356335 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356364 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be704e8e-9b46-4dfb-9363-278e61720eaa-pod-info\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356368 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356409 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqrnt\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-kube-api-access-cqrnt\") pod \"be704e8e-9b46-4dfb-9363-278e61720eaa\" (UID: \"be704e8e-9b46-4dfb-9363-278e61720eaa\") " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356772 4929 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.356783 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.357278 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.361026 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/be704e8e-9b46-4dfb-9363-278e61720eaa-pod-info" (OuterVolumeSpecName: "pod-info") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.361470 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-kube-api-access-cqrnt" (OuterVolumeSpecName: "kube-api-access-cqrnt") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "kube-api-access-cqrnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.361877 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.362049 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.364475 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be704e8e-9b46-4dfb-9363-278e61720eaa-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.381256 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data" (OuterVolumeSpecName: "config-data") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.394298 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-server-conf" (OuterVolumeSpecName: "server-conf") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.451230 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "be704e8e-9b46-4dfb-9363-278e61720eaa" (UID: "be704e8e-9b46-4dfb-9363-278e61720eaa"). InnerVolumeSpecName "rabbitmq-confd". 
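
Each UniqueName in these reconciler entries is a two-segment plugin name followed by a plugin-specific identifier, e.g. kubernetes.io/projected/<uid>-kube-api-access-cqrnt or kubernetes.io/local-volume/local-storage02-crc. An illustrative splitter, assuming that two-segment prefix; the kubelet has its own helpers for this:

    package main

    import (
        "fmt"
        "strings"
    )

    // splitUniqueName separates the plugin name from the
    // plugin-specific part of a volume UniqueName.
    func splitUniqueName(u string) (plugin, rest string, ok bool) {
        parts := strings.SplitN(u, "/", 3)
        if len(parts) != 3 {
            return "", "", false
        }
        return parts[0] + "/" + parts[1], parts[2], true
    }

    func main() {
        plugin, rest, _ := splitUniqueName(
            "kubernetes.io/local-volume/local-storage02-crc")
        fmt.Println(plugin, rest) // kubernetes.io/local-volume local-storage02-crc
    }
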
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.457933 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458061 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458238 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be704e8e-9b46-4dfb-9363-278e61720eaa-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458340 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458633 4929 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be704e8e-9b46-4dfb-9363-278e61720eaa-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458680 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqrnt\" (UniqueName: \"kubernetes.io/projected/be704e8e-9b46-4dfb-9363-278e61720eaa-kube-api-access-cqrnt\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458696 4929 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be704e8e-9b46-4dfb-9363-278e61720eaa-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458723 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.458736 4929 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be704e8e-9b46-4dfb-9363-278e61720eaa-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.472286 4929 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.491133 4929 generic.go:334] "Generic (PLEG): container finished" podID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerID="53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0" exitCode=0 Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.491181 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"be704e8e-9b46-4dfb-9363-278e61720eaa","Type":"ContainerDied","Data":"53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0"} Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.491215 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.491228 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"be704e8e-9b46-4dfb-9363-278e61720eaa","Type":"ContainerDied","Data":"bf7aba24704181a1c38632ffbceff0aea2923bdada51a9ab0d797148ce5f7bb4"} Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.491244 4929 scope.go:117] "RemoveContainer" containerID="53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.520448 4929 scope.go:117] "RemoveContainer" containerID="d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.524779 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.529647 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.551196 4929 scope.go:117] "RemoveContainer" containerID="53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0" Oct 02 11:35:34 crc kubenswrapper[4929]: E1002 11:35:34.551803 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0\": container with ID starting with 53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0 not found: ID does not exist" containerID="53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.551852 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0"} err="failed to get container status \"53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0\": rpc error: code = NotFound desc = could not find container \"53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0\": container with ID starting with 53589009f68d44dcb13e1a9aa90c37e722ebd2ce65bea66f1d68e6beb9444bb0 not found: ID does not exist" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.551878 4929 scope.go:117] "RemoveContainer" containerID="d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919" Oct 02 11:35:34 crc kubenswrapper[4929]: E1002 11:35:34.552343 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919\": container with ID starting with d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919 not found: ID does not exist" containerID="d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 11:35:34.552392 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919"} err="failed to get container status \"d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919\": rpc error: code = NotFound desc = could not find container \"d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919\": container with ID starting with d0647e0ebea3b6764b5b237a3c7e786831cd1e4ee8685723b81288439d49c919 not found: ID does not exist" Oct 02 11:35:34 crc kubenswrapper[4929]: I1002 
11:35:34.559467 4929 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:35:36 crc kubenswrapper[4929]: I1002 11:35:36.165501 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" path="/var/lib/kubelet/pods/be704e8e-9b46-4dfb-9363-278e61720eaa/volumes" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.632420 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ls6j"] Oct 02 11:35:48 crc kubenswrapper[4929]: E1002 11:35:48.633321 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="registry-server" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.633337 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="registry-server" Oct 02 11:35:48 crc kubenswrapper[4929]: E1002 11:35:48.633352 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="extract-content" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.633360 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="extract-content" Oct 02 11:35:48 crc kubenswrapper[4929]: E1002 11:35:48.633397 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="rabbitmq" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.633406 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="rabbitmq" Oct 02 11:35:48 crc kubenswrapper[4929]: E1002 11:35:48.633427 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="setup-container" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.633434 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="setup-container" Oct 02 11:35:48 crc kubenswrapper[4929]: E1002 11:35:48.633451 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="extract-utilities" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.633462 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="extract-utilities" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.633635 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="be704e8e-9b46-4dfb-9363-278e61720eaa" containerName="rabbitmq" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.633649 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="75547475-5d24-47c9-8a01-354946175616" containerName="registry-server" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.634687 4929 util.go:30] "No sandbox for pod can be found. 
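
The cpu_manager and memory_manager entries above show stale-state cleanup on pod admission: when community-operators-8ls6j is added, per-container assignments left over from the deleted marketplace and rabbitmq pods are purged. A toy sketch of that sweep with invented types; the real managers and their state stores live in the kubelet:

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStaleState drops assignments whose pod is no longer active,
    // mirroring the "Deleted CPUSet assignment" entries above.
    func removeStaleState(assignments map[key][]int, active map[string]bool) {
        for k := range assignments {
            if !active[k.podUID] {
                fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n",
                    k.podUID, k.container)
                delete(assignments, k)
            }
        }
    }

    func main() {
        state := map[key][]int{
            {"75547475-5d24-47c9-8a01-354946175616", "registry-server"}: {0, 1},
        }
        removeStaleState(state, map[string]bool{
            "7b829cce-1d28-4119-9a07-90f828f9eb86": true, // the newly admitted pod
        })
    }
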
Need to start a new one" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.637661 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-catalog-content\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.637722 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7qng\" (UniqueName: \"kubernetes.io/projected/7b829cce-1d28-4119-9a07-90f828f9eb86-kube-api-access-q7qng\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.638000 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-utilities\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.653359 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ls6j"] Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.739692 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-utilities\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.739746 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-catalog-content\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.739790 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7qng\" (UniqueName: \"kubernetes.io/projected/7b829cce-1d28-4119-9a07-90f828f9eb86-kube-api-access-q7qng\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.740518 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-utilities\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.740762 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-catalog-content\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.766269 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q7qng\" (UniqueName: \"kubernetes.io/projected/7b829cce-1d28-4119-9a07-90f828f9eb86-kube-api-access-q7qng\") pod \"community-operators-8ls6j\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:48 crc kubenswrapper[4929]: I1002 11:35:48.954739 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:49 crc kubenswrapper[4929]: I1002 11:35:49.409018 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ls6j"] Oct 02 11:35:49 crc kubenswrapper[4929]: I1002 11:35:49.604649 4929 generic.go:334] "Generic (PLEG): container finished" podID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerID="c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933" exitCode=0 Oct 02 11:35:49 crc kubenswrapper[4929]: I1002 11:35:49.604697 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ls6j" event={"ID":"7b829cce-1d28-4119-9a07-90f828f9eb86","Type":"ContainerDied","Data":"c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933"} Oct 02 11:35:49 crc kubenswrapper[4929]: I1002 11:35:49.604752 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ls6j" event={"ID":"7b829cce-1d28-4119-9a07-90f828f9eb86","Type":"ContainerStarted","Data":"db709270238a9d1f3bf4815372af2cab785824a9ae69ce6d4f184cd349227775"} Oct 02 11:35:51 crc kubenswrapper[4929]: I1002 11:35:51.621321 4929 generic.go:334] "Generic (PLEG): container finished" podID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerID="a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b" exitCode=0 Oct 02 11:35:51 crc kubenswrapper[4929]: I1002 11:35:51.621398 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ls6j" event={"ID":"7b829cce-1d28-4119-9a07-90f828f9eb86","Type":"ContainerDied","Data":"a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b"} Oct 02 11:35:52 crc kubenswrapper[4929]: I1002 11:35:52.634283 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ls6j" event={"ID":"7b829cce-1d28-4119-9a07-90f828f9eb86","Type":"ContainerStarted","Data":"4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405"} Oct 02 11:35:52 crc kubenswrapper[4929]: I1002 11:35:52.652482 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ls6j" podStartSLOduration=2.159364596 podStartE2EDuration="4.652464055s" podCreationTimestamp="2025-10-02 11:35:48 +0000 UTC" firstStartedPulling="2025-10-02 11:35:49.606182784 +0000 UTC m=+1550.156549148" lastFinishedPulling="2025-10-02 11:35:52.099282243 +0000 UTC m=+1552.649648607" observedRunningTime="2025-10-02 11:35:52.649277663 +0000 UTC m=+1553.199644027" watchObservedRunningTime="2025-10-02 11:35:52.652464055 +0000 UTC m=+1553.202830419" Oct 02 11:35:58 crc kubenswrapper[4929]: I1002 11:35:58.955438 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:58 crc kubenswrapper[4929]: I1002 11:35:58.955900 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:58 crc kubenswrapper[4929]: I1002 11:35:58.991638 4929 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:59 crc kubenswrapper[4929]: I1002 11:35:59.762994 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:35:59 crc kubenswrapper[4929]: I1002 11:35:59.814188 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ls6j"] Oct 02 11:36:01 crc kubenswrapper[4929]: I1002 11:36:01.714731 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ls6j" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="registry-server" containerID="cri-o://4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405" gracePeriod=2 Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.117340 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.217087 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-catalog-content\") pod \"7b829cce-1d28-4119-9a07-90f828f9eb86\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.217122 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-utilities\") pod \"7b829cce-1d28-4119-9a07-90f828f9eb86\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.217151 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7qng\" (UniqueName: \"kubernetes.io/projected/7b829cce-1d28-4119-9a07-90f828f9eb86-kube-api-access-q7qng\") pod \"7b829cce-1d28-4119-9a07-90f828f9eb86\" (UID: \"7b829cce-1d28-4119-9a07-90f828f9eb86\") " Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.218314 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-utilities" (OuterVolumeSpecName: "utilities") pod "7b829cce-1d28-4119-9a07-90f828f9eb86" (UID: "7b829cce-1d28-4119-9a07-90f828f9eb86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.222159 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b829cce-1d28-4119-9a07-90f828f9eb86-kube-api-access-q7qng" (OuterVolumeSpecName: "kube-api-access-q7qng") pod "7b829cce-1d28-4119-9a07-90f828f9eb86" (UID: "7b829cce-1d28-4119-9a07-90f828f9eb86"). InnerVolumeSpecName "kube-api-access-q7qng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.271930 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b829cce-1d28-4119-9a07-90f828f9eb86" (UID: "7b829cce-1d28-4119-9a07-90f828f9eb86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.319235 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.319363 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b829cce-1d28-4119-9a07-90f828f9eb86-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.319379 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7qng\" (UniqueName: \"kubernetes.io/projected/7b829cce-1d28-4119-9a07-90f828f9eb86-kube-api-access-q7qng\") on node \"crc\" DevicePath \"\"" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.725315 4929 generic.go:334] "Generic (PLEG): container finished" podID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerID="4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405" exitCode=0 Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.725359 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ls6j" event={"ID":"7b829cce-1d28-4119-9a07-90f828f9eb86","Type":"ContainerDied","Data":"4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405"} Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.725411 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ls6j" event={"ID":"7b829cce-1d28-4119-9a07-90f828f9eb86","Type":"ContainerDied","Data":"db709270238a9d1f3bf4815372af2cab785824a9ae69ce6d4f184cd349227775"} Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.725429 4929 scope.go:117] "RemoveContainer" containerID="4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.725604 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ls6j" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.776018 4929 scope.go:117] "RemoveContainer" containerID="a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.776421 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ls6j"] Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.783510 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ls6j"] Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.796511 4929 scope.go:117] "RemoveContainer" containerID="c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.821454 4929 scope.go:117] "RemoveContainer" containerID="4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405" Oct 02 11:36:02 crc kubenswrapper[4929]: E1002 11:36:02.821813 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405\": container with ID starting with 4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405 not found: ID does not exist" containerID="4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.821849 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405"} err="failed to get container status \"4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405\": rpc error: code = NotFound desc = could not find container \"4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405\": container with ID starting with 4e6f6e322fd618cb52d1e53b87d72c5adde7ea56f33cf72d60914512e4e20405 not found: ID does not exist" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.821979 4929 scope.go:117] "RemoveContainer" containerID="a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b" Oct 02 11:36:02 crc kubenswrapper[4929]: E1002 11:36:02.822318 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b\": container with ID starting with a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b not found: ID does not exist" containerID="a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.822344 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b"} err="failed to get container status \"a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b\": rpc error: code = NotFound desc = could not find container \"a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b\": container with ID starting with a2a2aa6abd0582e5812b3c61a9a516e0b8f6dd9cdb2e02228b323e7a90ee0d9b not found: ID does not exist" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.822359 4929 scope.go:117] "RemoveContainer" containerID="c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933" Oct 02 11:36:02 crc kubenswrapper[4929]: E1002 11:36:02.822595 4929 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933\": container with ID starting with c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933 not found: ID does not exist" containerID="c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933" Oct 02 11:36:02 crc kubenswrapper[4929]: I1002 11:36:02.822624 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933"} err="failed to get container status \"c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933\": rpc error: code = NotFound desc = could not find container \"c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933\": container with ID starting with c3de0045e04a90cac5dfbeba646823e69e8047738f4fa25979c4a5fe9f2d4933 not found: ID does not exist" Oct 02 11:36:04 crc kubenswrapper[4929]: I1002 11:36:04.164525 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" path="/var/lib/kubelet/pods/7b829cce-1d28-4119-9a07-90f828f9eb86/volumes" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.397628 4929 scope.go:117] "RemoveContainer" containerID="9dee8e121405f59c5f13c985beb42dd248f4368f1e1001f5114864300ae94b3d" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.421775 4929 scope.go:117] "RemoveContainer" containerID="545256c2e5a3d079c962d1cd6d366f7f7284a0b50af8728c42f20445f2a074fc" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.472513 4929 scope.go:117] "RemoveContainer" containerID="4443b5280736a17dd9f32e8de85fda28418ccddac29c408f2aa471462e13392a" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.510358 4929 scope.go:117] "RemoveContainer" containerID="34777fd28d6f0a388c6451ee50aae23f26532ca2ddbc18c61ad44db7497a0ee7" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.536981 4929 scope.go:117] "RemoveContainer" containerID="c87639312612f5b7e09b8aebb0316b6075f81445245717e51fda5751a3db56c8" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.560976 4929 scope.go:117] "RemoveContainer" containerID="bd2c5dbd1a7a392222669a7ca1331d0bc09ddb112ba5fed20e7f76db48ff7eed" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.591504 4929 scope.go:117] "RemoveContainer" containerID="563132632ec9af4b517dd0c98a94e892d44876a724b6c8146cd51c68809ccdb5" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.615942 4929 scope.go:117] "RemoveContainer" containerID="debb4348289999a8f3abaf03c4450711bafdb203a33eb5fe165ceb3841367bae" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.634337 4929 scope.go:117] "RemoveContainer" containerID="a54ad1e68998d5eeaffecd2ceb60e4ce78bfa102c02171343d152bc7337ef342" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.652791 4929 scope.go:117] "RemoveContainer" containerID="82486077057b1ee0fcab31077dd4225d2021103abf86b2dbd1c7cfe1ea382477" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.672239 4929 scope.go:117] "RemoveContainer" containerID="96f765af27e40428cb700ca16b6c24d9da1ada736c4753ceb2817710484416f8" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.698595 4929 scope.go:117] "RemoveContainer" containerID="7384489077dc8cd3293840bae9aadd9e62441ed7dbde530c3d8c89e74b1b0110" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.715738 4929 scope.go:117] "RemoveContainer" containerID="95f2131a5f9718323b706cfe761be1d34f7ebf9f5b078a46bf5ca95158bec8a3" Oct 02 11:36:12 crc 
kubenswrapper[4929]: I1002 11:36:12.746953 4929 scope.go:117] "RemoveContainer" containerID="58658af3353a3f18bc660e889c500b662caf020ac76f0cd05852a0df879fac0b" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.762746 4929 scope.go:117] "RemoveContainer" containerID="047acfe6971368179c8ba9cdbd0536c0529e14bdc217f4de7400dd6418593eae" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.782001 4929 scope.go:117] "RemoveContainer" containerID="1746635e6e6644889048b899323774c25ad20cec9c641616c1b701921c91a205" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.815544 4929 scope.go:117] "RemoveContainer" containerID="ed962c1b5d656d4e1215fbf1c48196a168d154f5fc0d6e13903f2e7ae10877e1" Oct 02 11:36:12 crc kubenswrapper[4929]: I1002 11:36:12.853765 4929 scope.go:117] "RemoveContainer" containerID="5d9df53a4f7dd6bf6680f6dc938e2a767b61229af9d876f885f266a730cf0a8a" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.161922 4929 scope.go:117] "RemoveContainer" containerID="fbb7ed5b03ef1c144f9d8326fca0faa2b3ea252a058e454c2f5d13b47dbb1af1" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.251161 4929 scope.go:117] "RemoveContainer" containerID="4c2c63d65925d087720cb093da6f11e1c0fd2badb401568b1904d09e45b554c8" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.283487 4929 scope.go:117] "RemoveContainer" containerID="28bd669c66fa526e77aca54c489ecfc6d4c91a615c3e8bf12eb5abf47179648c" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.323121 4929 scope.go:117] "RemoveContainer" containerID="bf1df801c33923c52b57d70ffbeb276141d6b88cfc7248fef720e8247b706bf1" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.343117 4929 scope.go:117] "RemoveContainer" containerID="7dc3a73ce0b7c90f23c4ff562b6e91fe7ea7b7babea5572654dcc3b8d39ab6c5" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.377545 4929 scope.go:117] "RemoveContainer" containerID="c3d54d33154267067894be6f6bf383c4dd38b0d264f89c53eca1cce53b8f5f42" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.407288 4929 scope.go:117] "RemoveContainer" containerID="157276ef9d3a545d2f5ce4288c1bab5d100b24eab57de2c4e08c2b13bd82b387" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.429026 4929 scope.go:117] "RemoveContainer" containerID="24faef81b6cede3d67c9462380194d13406ef6780e10159894ee6b2ef0ce25e9" Oct 02 11:37:13 crc kubenswrapper[4929]: I1002 11:37:13.458938 4929 scope.go:117] "RemoveContainer" containerID="33d493d22c989aac977ca56b30eaa4751bee9cf5bc3d357763c0d2fbfd6345c3" Oct 02 11:37:14 crc kubenswrapper[4929]: I1002 11:37:14.736598 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:37:14 crc kubenswrapper[4929]: I1002 11:37:14.736656 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:37:44 crc kubenswrapper[4929]: I1002 11:37:44.737076 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Oct 02 11:37:44 crc kubenswrapper[4929]: I1002 11:37:44.737574 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.587482 4929 scope.go:117] "RemoveContainer" containerID="bb6ab76b4edcd3ece1c809ec2d716793e3d830f8bf38281a676ff06906c37bf9" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.614624 4929 scope.go:117] "RemoveContainer" containerID="2626d047aaf7ff7ef08925d134860a4d2f30e9567daf32e620de10d250ba21b3" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.641211 4929 scope.go:117] "RemoveContainer" containerID="c096e2aa711406812066af32f9e379cdc47cd54789bd6714fe77bfadcd569f8e" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.664660 4929 scope.go:117] "RemoveContainer" containerID="4be057dccda59231a48ae224c91812c2f935f4405234210c0ddc8c69220d1861" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.716521 4929 scope.go:117] "RemoveContainer" containerID="30e5a4ae31d94b664d18b31af6f668ff807fcdfc9dab241f49d3553f3335eafe" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.732329 4929 scope.go:117] "RemoveContainer" containerID="b4c23593ffd88427329daed4cd6385647bd0a34bbdda04cb685bb9c7e76335c3" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.749143 4929 scope.go:117] "RemoveContainer" containerID="cb914ad47f69429fb77bb7b326a163fa90f79b907935df45b1b8e560bf4455c2" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.764987 4929 scope.go:117] "RemoveContainer" containerID="63be97525a116a20d685df7b368cdf81587e9c84bb1a7ebc7feba61f32b3fd4a" Oct 02 11:38:13 crc kubenswrapper[4929]: I1002 11:38:13.788762 4929 scope.go:117] "RemoveContainer" containerID="33eb9bc027be9b469ed5574f73a2eb7bb9e751142cdb9f8eb6c7c3fbc32f3e27" Oct 02 11:38:14 crc kubenswrapper[4929]: I1002 11:38:14.736414 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:38:14 crc kubenswrapper[4929]: I1002 11:38:14.736465 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:38:14 crc kubenswrapper[4929]: I1002 11:38:14.736514 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:38:14 crc kubenswrapper[4929]: I1002 11:38:14.737790 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:38:14 crc kubenswrapper[4929]: I1002 11:38:14.737840 4929 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" gracePeriod=600 Oct 02 11:38:14 crc kubenswrapper[4929]: E1002 11:38:14.863062 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:38:15 crc kubenswrapper[4929]: I1002 11:38:15.832007 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" exitCode=0 Oct 02 11:38:15 crc kubenswrapper[4929]: I1002 11:38:15.832065 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0"} Oct 02 11:38:15 crc kubenswrapper[4929]: I1002 11:38:15.832187 4929 scope.go:117] "RemoveContainer" containerID="85f08424ea0549c33e8adce5bf52a0ee3804dea4bc1b5c410a9b0fdc77644661" Oct 02 11:38:15 crc kubenswrapper[4929]: I1002 11:38:15.832660 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:38:15 crc kubenswrapper[4929]: E1002 11:38:15.832860 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:38:30 crc kubenswrapper[4929]: I1002 11:38:30.162679 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:38:30 crc kubenswrapper[4929]: E1002 11:38:30.163401 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:38:43 crc kubenswrapper[4929]: I1002 11:38:43.157120 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:38:43 crc kubenswrapper[4929]: E1002 11:38:43.159767 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 
11:38:57 crc kubenswrapper[4929]: I1002 11:38:57.157188 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:38:57 crc kubenswrapper[4929]: E1002 11:38:57.157915 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:39:09 crc kubenswrapper[4929]: I1002 11:39:09.157615 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:39:09 crc kubenswrapper[4929]: E1002 11:39:09.158722 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:39:13 crc kubenswrapper[4929]: I1002 11:39:13.911224 4929 scope.go:117] "RemoveContainer" containerID="2ba25c8a47f17b1dfb7c707cfd669e6852019b20812604c78ad59f75e272e96d" Oct 02 11:39:13 crc kubenswrapper[4929]: I1002 11:39:13.937778 4929 scope.go:117] "RemoveContainer" containerID="c0eb3789f6d2a7ebb7bd842f108822fb6688be628ad39ba69601664e03c28d2a" Oct 02 11:39:13 crc kubenswrapper[4929]: I1002 11:39:13.971933 4929 scope.go:117] "RemoveContainer" containerID="acd44210c8619861169d2f06a85784af5612795c38734e9a5ca2e1173b57d742" Oct 02 11:39:14 crc kubenswrapper[4929]: I1002 11:39:14.014262 4929 scope.go:117] "RemoveContainer" containerID="bc25e8a77ed881d2d4de5ab990535881b586e9d4b601a6e9153ec40e5251d305" Oct 02 11:39:20 crc kubenswrapper[4929]: I1002 11:39:20.159869 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:39:20 crc kubenswrapper[4929]: E1002 11:39:20.160388 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:39:33 crc kubenswrapper[4929]: I1002 11:39:33.157684 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:39:33 crc kubenswrapper[4929]: E1002 11:39:33.159357 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:39:45 crc kubenswrapper[4929]: I1002 11:39:45.156611 4929 scope.go:117] "RemoveContainer" 
containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:39:45 crc kubenswrapper[4929]: E1002 11:39:45.157376 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:39:59 crc kubenswrapper[4929]: I1002 11:39:59.156977 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:39:59 crc kubenswrapper[4929]: E1002 11:39:59.157762 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:40:10 crc kubenswrapper[4929]: I1002 11:40:10.161631 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:40:10 crc kubenswrapper[4929]: E1002 11:40:10.162459 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.342503 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nz7px"] Oct 02 11:40:22 crc kubenswrapper[4929]: E1002 11:40:22.344097 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="extract-content" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.344120 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="extract-content" Oct 02 11:40:22 crc kubenswrapper[4929]: E1002 11:40:22.344155 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="registry-server" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.344164 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="registry-server" Oct 02 11:40:22 crc kubenswrapper[4929]: E1002 11:40:22.344185 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="extract-utilities" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.344196 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="extract-utilities" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.344431 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b829cce-1d28-4119-9a07-90f828f9eb86" containerName="registry-server" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.346263 4929 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.353734 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nz7px"] Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.442435 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-utilities\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.442525 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-catalog-content\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.442686 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddlg\" (UniqueName: \"kubernetes.io/projected/4ad8adff-0a2b-4ce1-af75-da8554565c03-kube-api-access-kddlg\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.544565 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-catalog-content\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.544624 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddlg\" (UniqueName: \"kubernetes.io/projected/4ad8adff-0a2b-4ce1-af75-da8554565c03-kube-api-access-kddlg\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.544728 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-utilities\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.545159 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-catalog-content\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.545364 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-utilities\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 
11:40:22.570162 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddlg\" (UniqueName: \"kubernetes.io/projected/4ad8adff-0a2b-4ce1-af75-da8554565c03-kube-api-access-kddlg\") pod \"certified-operators-nz7px\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:22 crc kubenswrapper[4929]: I1002 11:40:22.674174 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:23 crc kubenswrapper[4929]: I1002 11:40:23.144820 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nz7px"] Oct 02 11:40:23 crc kubenswrapper[4929]: I1002 11:40:23.753008 4929 generic.go:334] "Generic (PLEG): container finished" podID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerID="1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c" exitCode=0 Oct 02 11:40:23 crc kubenswrapper[4929]: I1002 11:40:23.753088 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz7px" event={"ID":"4ad8adff-0a2b-4ce1-af75-da8554565c03","Type":"ContainerDied","Data":"1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c"} Oct 02 11:40:23 crc kubenswrapper[4929]: I1002 11:40:23.753437 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz7px" event={"ID":"4ad8adff-0a2b-4ce1-af75-da8554565c03","Type":"ContainerStarted","Data":"a6f873f84dc5a4aedb3eb770e066eb7536380284432d19aa15831352e2a2ef1e"} Oct 02 11:40:23 crc kubenswrapper[4929]: I1002 11:40:23.755311 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:40:25 crc kubenswrapper[4929]: I1002 11:40:25.156838 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:40:25 crc kubenswrapper[4929]: E1002 11:40:25.157317 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:40:25 crc kubenswrapper[4929]: I1002 11:40:25.776686 4929 generic.go:334] "Generic (PLEG): container finished" podID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerID="6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289" exitCode=0 Oct 02 11:40:25 crc kubenswrapper[4929]: I1002 11:40:25.776752 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz7px" event={"ID":"4ad8adff-0a2b-4ce1-af75-da8554565c03","Type":"ContainerDied","Data":"6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289"} Oct 02 11:40:26 crc kubenswrapper[4929]: I1002 11:40:26.785054 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz7px" event={"ID":"4ad8adff-0a2b-4ce1-af75-da8554565c03","Type":"ContainerStarted","Data":"7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1"} Oct 02 11:40:26 crc kubenswrapper[4929]: I1002 11:40:26.807285 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nz7px" 
podStartSLOduration=2.267410046 podStartE2EDuration="4.807259889s" podCreationTimestamp="2025-10-02 11:40:22 +0000 UTC" firstStartedPulling="2025-10-02 11:40:23.755097294 +0000 UTC m=+1824.305463658" lastFinishedPulling="2025-10-02 11:40:26.294947137 +0000 UTC m=+1826.845313501" observedRunningTime="2025-10-02 11:40:26.801328466 +0000 UTC m=+1827.351694840" watchObservedRunningTime="2025-10-02 11:40:26.807259889 +0000 UTC m=+1827.357626253" Oct 02 11:40:32 crc kubenswrapper[4929]: I1002 11:40:32.674554 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:32 crc kubenswrapper[4929]: I1002 11:40:32.674855 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:32 crc kubenswrapper[4929]: I1002 11:40:32.718873 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:32 crc kubenswrapper[4929]: I1002 11:40:32.861729 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:32 crc kubenswrapper[4929]: I1002 11:40:32.949344 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nz7px"] Oct 02 11:40:34 crc kubenswrapper[4929]: I1002 11:40:34.838547 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nz7px" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="registry-server" containerID="cri-o://7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1" gracePeriod=2 Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.267145 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.411510 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kddlg\" (UniqueName: \"kubernetes.io/projected/4ad8adff-0a2b-4ce1-af75-da8554565c03-kube-api-access-kddlg\") pod \"4ad8adff-0a2b-4ce1-af75-da8554565c03\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.411613 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-catalog-content\") pod \"4ad8adff-0a2b-4ce1-af75-da8554565c03\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.411685 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-utilities\") pod \"4ad8adff-0a2b-4ce1-af75-da8554565c03\" (UID: \"4ad8adff-0a2b-4ce1-af75-da8554565c03\") " Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.413250 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-utilities" (OuterVolumeSpecName: "utilities") pod "4ad8adff-0a2b-4ce1-af75-da8554565c03" (UID: "4ad8adff-0a2b-4ce1-af75-da8554565c03"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.417057 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad8adff-0a2b-4ce1-af75-da8554565c03-kube-api-access-kddlg" (OuterVolumeSpecName: "kube-api-access-kddlg") pod "4ad8adff-0a2b-4ce1-af75-da8554565c03" (UID: "4ad8adff-0a2b-4ce1-af75-da8554565c03"). InnerVolumeSpecName "kube-api-access-kddlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.515058 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.515090 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kddlg\" (UniqueName: \"kubernetes.io/projected/4ad8adff-0a2b-4ce1-af75-da8554565c03-kube-api-access-kddlg\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.846993 4929 generic.go:334] "Generic (PLEG): container finished" podID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerID="7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1" exitCode=0 Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.847053 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz7px" event={"ID":"4ad8adff-0a2b-4ce1-af75-da8554565c03","Type":"ContainerDied","Data":"7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1"} Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.847386 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz7px" event={"ID":"4ad8adff-0a2b-4ce1-af75-da8554565c03","Type":"ContainerDied","Data":"a6f873f84dc5a4aedb3eb770e066eb7536380284432d19aa15831352e2a2ef1e"} Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.847404 4929 scope.go:117] "RemoveContainer" containerID="7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.847064 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nz7px" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.873792 4929 scope.go:117] "RemoveContainer" containerID="6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.897236 4929 scope.go:117] "RemoveContainer" containerID="1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.925422 4929 scope.go:117] "RemoveContainer" containerID="7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1" Oct 02 11:40:35 crc kubenswrapper[4929]: E1002 11:40:35.926592 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1\": container with ID starting with 7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1 not found: ID does not exist" containerID="7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.926675 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1"} err="failed to get container status \"7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1\": rpc error: code = NotFound desc = could not find container \"7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1\": container with ID starting with 7c9eea46ceac655aaabdddb58e81587cbb27fb3c022bb3eb63a532e82e138fd1 not found: ID does not exist" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.926706 4929 scope.go:117] "RemoveContainer" containerID="6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289" Oct 02 11:40:35 crc kubenswrapper[4929]: E1002 11:40:35.927377 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289\": container with ID starting with 6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289 not found: ID does not exist" containerID="6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.927429 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289"} err="failed to get container status \"6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289\": rpc error: code = NotFound desc = could not find container \"6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289\": container with ID starting with 6d3a8958cf2f5ac98b649440555a1951a979b2aaf4118650c031fe85950b2289 not found: ID does not exist" Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.927460 4929 scope.go:117] "RemoveContainer" containerID="1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c" Oct 02 11:40:35 crc kubenswrapper[4929]: E1002 11:40:35.927844 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c\": container with ID starting with 1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c not found: ID does not exist" containerID="1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c" 
Oct 02 11:40:35 crc kubenswrapper[4929]: I1002 11:40:35.927915 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c"} err="failed to get container status \"1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c\": rpc error: code = NotFound desc = could not find container \"1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c\": container with ID starting with 1fecc96dee3cf3879ee7c83f1260f43b877e83b4dfe30e78241479f19b723a3c not found: ID does not exist" Oct 02 11:40:36 crc kubenswrapper[4929]: I1002 11:40:36.136558 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ad8adff-0a2b-4ce1-af75-da8554565c03" (UID: "4ad8adff-0a2b-4ce1-af75-da8554565c03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:40:36 crc kubenswrapper[4929]: I1002 11:40:36.188346 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nz7px"] Oct 02 11:40:36 crc kubenswrapper[4929]: I1002 11:40:36.193568 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nz7px"] Oct 02 11:40:36 crc kubenswrapper[4929]: I1002 11:40:36.224604 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ad8adff-0a2b-4ce1-af75-da8554565c03-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:40:38 crc kubenswrapper[4929]: I1002 11:40:38.166058 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" path="/var/lib/kubelet/pods/4ad8adff-0a2b-4ce1-af75-da8554565c03/volumes" Oct 02 11:40:40 crc kubenswrapper[4929]: I1002 11:40:40.160165 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:40:40 crc kubenswrapper[4929]: E1002 11:40:40.160395 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:40:51 crc kubenswrapper[4929]: I1002 11:40:51.155987 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:40:51 crc kubenswrapper[4929]: E1002 11:40:51.156782 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:41:05 crc kubenswrapper[4929]: I1002 11:41:05.157209 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:41:05 crc kubenswrapper[4929]: E1002 11:41:05.157920 4929 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:41:14 crc kubenswrapper[4929]: I1002 11:41:14.118565 4929 scope.go:117] "RemoveContainer" containerID="9b38f23d86986aac0b6e3cdfaf784dd0e3723c3341bf10d16f821513ffafacff" Oct 02 11:41:14 crc kubenswrapper[4929]: I1002 11:41:14.141037 4929 scope.go:117] "RemoveContainer" containerID="2ca8828c67e79e41d4ae3bdf92de19a250e65e7977515288ad6954c923d4fa65" Oct 02 11:41:14 crc kubenswrapper[4929]: I1002 11:41:14.156337 4929 scope.go:117] "RemoveContainer" containerID="2712b8756332b6295b655e11ab9f980ca701e1946c08758c38d253cc1e57c014" Oct 02 11:41:14 crc kubenswrapper[4929]: I1002 11:41:14.197708 4929 scope.go:117] "RemoveContainer" containerID="472cf392fc1d08934faa0f282d9bc772f4855616312d90f4abf1debe7b4a8890" Oct 02 11:41:19 crc kubenswrapper[4929]: I1002 11:41:19.157061 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:41:19 crc kubenswrapper[4929]: E1002 11:41:19.157644 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:41:31 crc kubenswrapper[4929]: I1002 11:41:31.156420 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:41:31 crc kubenswrapper[4929]: E1002 11:41:31.157077 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:41:42 crc kubenswrapper[4929]: I1002 11:41:42.157175 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:41:42 crc kubenswrapper[4929]: E1002 11:41:42.157929 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:41:53 crc kubenswrapper[4929]: I1002 11:41:53.157003 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:41:53 crc kubenswrapper[4929]: E1002 11:41:53.157817 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:42:04 crc kubenswrapper[4929]: I1002 11:42:04.157312 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:42:04 crc kubenswrapper[4929]: E1002 11:42:04.158082 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:42:18 crc kubenswrapper[4929]: I1002 11:42:18.156627 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:42:18 crc kubenswrapper[4929]: E1002 11:42:18.158383 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:42:30 crc kubenswrapper[4929]: I1002 11:42:30.159737 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:42:30 crc kubenswrapper[4929]: E1002 11:42:30.160421 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:42:44 crc kubenswrapper[4929]: I1002 11:42:44.157004 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:42:44 crc kubenswrapper[4929]: E1002 11:42:44.157669 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:42:57 crc kubenswrapper[4929]: I1002 11:42:57.156841 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:42:57 crc kubenswrapper[4929]: E1002 11:42:57.157642 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:43:09 crc kubenswrapper[4929]: I1002 11:43:09.157837 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:43:09 crc kubenswrapper[4929]: E1002 11:43:09.158698 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:43:22 crc kubenswrapper[4929]: I1002 11:43:22.157582 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:43:23 crc kubenswrapper[4929]: I1002 11:43:23.023003 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"6da31fe02e5a524e585a14cfb339228963a2b369ab656c7717da73044298b165"} Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.143188 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq"] Oct 02 11:45:00 crc kubenswrapper[4929]: E1002 11:45:00.144105 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="extract-content" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.144120 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="extract-content" Oct 02 11:45:00 crc kubenswrapper[4929]: E1002 11:45:00.144136 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.144144 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4929]: E1002 11:45:00.144170 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="extract-utilities" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.144180 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="extract-utilities" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.144394 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad8adff-0a2b-4ce1-af75-da8554565c03" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.144950 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.148847 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.149237 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.174102 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq"] Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.252502 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dff74db9-c703-44d9-9bcf-bc6ef951a464-secret-volume\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.252541 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzbgw\" (UniqueName: \"kubernetes.io/projected/dff74db9-c703-44d9-9bcf-bc6ef951a464-kube-api-access-kzbgw\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.252570 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dff74db9-c703-44d9-9bcf-bc6ef951a464-config-volume\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.353779 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dff74db9-c703-44d9-9bcf-bc6ef951a464-secret-volume\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.354910 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzbgw\" (UniqueName: \"kubernetes.io/projected/dff74db9-c703-44d9-9bcf-bc6ef951a464-kube-api-access-kzbgw\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.355137 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dff74db9-c703-44d9-9bcf-bc6ef951a464-config-volume\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.356420 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dff74db9-c703-44d9-9bcf-bc6ef951a464-config-volume\") pod 
\"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.361756 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dff74db9-c703-44d9-9bcf-bc6ef951a464-secret-volume\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.372037 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzbgw\" (UniqueName: \"kubernetes.io/projected/dff74db9-c703-44d9-9bcf-bc6ef951a464-kube-api-access-kzbgw\") pod \"collect-profiles-29323425-mhgwq\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.471856 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:00 crc kubenswrapper[4929]: W1002 11:45:00.939101 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff74db9_c703_44d9_9bcf_bc6ef951a464.slice/crio-46773c0b67fc93e389337267652c19307a58c08bdd6dcd89d7a631a6ce253859 WatchSource:0}: Error finding container 46773c0b67fc93e389337267652c19307a58c08bdd6dcd89d7a631a6ce253859: Status 404 returned error can't find the container with id 46773c0b67fc93e389337267652c19307a58c08bdd6dcd89d7a631a6ce253859 Oct 02 11:45:00 crc kubenswrapper[4929]: I1002 11:45:00.940720 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq"] Oct 02 11:45:01 crc kubenswrapper[4929]: I1002 11:45:01.716399 4929 generic.go:334] "Generic (PLEG): container finished" podID="dff74db9-c703-44d9-9bcf-bc6ef951a464" containerID="2596cc18370ca8142983eb8609f5ad7eb686de837856952e3d5893619d47cb34" exitCode=0 Oct 02 11:45:01 crc kubenswrapper[4929]: I1002 11:45:01.716447 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" event={"ID":"dff74db9-c703-44d9-9bcf-bc6ef951a464","Type":"ContainerDied","Data":"2596cc18370ca8142983eb8609f5ad7eb686de837856952e3d5893619d47cb34"} Oct 02 11:45:01 crc kubenswrapper[4929]: I1002 11:45:01.716498 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" event={"ID":"dff74db9-c703-44d9-9bcf-bc6ef951a464","Type":"ContainerStarted","Data":"46773c0b67fc93e389337267652c19307a58c08bdd6dcd89d7a631a6ce253859"} Oct 02 11:45:02 crc kubenswrapper[4929]: I1002 11:45:02.983919 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.095602 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzbgw\" (UniqueName: \"kubernetes.io/projected/dff74db9-c703-44d9-9bcf-bc6ef951a464-kube-api-access-kzbgw\") pod \"dff74db9-c703-44d9-9bcf-bc6ef951a464\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.095752 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dff74db9-c703-44d9-9bcf-bc6ef951a464-secret-volume\") pod \"dff74db9-c703-44d9-9bcf-bc6ef951a464\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.095784 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dff74db9-c703-44d9-9bcf-bc6ef951a464-config-volume\") pod \"dff74db9-c703-44d9-9bcf-bc6ef951a464\" (UID: \"dff74db9-c703-44d9-9bcf-bc6ef951a464\") " Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.097110 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dff74db9-c703-44d9-9bcf-bc6ef951a464-config-volume" (OuterVolumeSpecName: "config-volume") pod "dff74db9-c703-44d9-9bcf-bc6ef951a464" (UID: "dff74db9-c703-44d9-9bcf-bc6ef951a464"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.100892 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff74db9-c703-44d9-9bcf-bc6ef951a464-kube-api-access-kzbgw" (OuterVolumeSpecName: "kube-api-access-kzbgw") pod "dff74db9-c703-44d9-9bcf-bc6ef951a464" (UID: "dff74db9-c703-44d9-9bcf-bc6ef951a464"). InnerVolumeSpecName "kube-api-access-kzbgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.101346 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff74db9-c703-44d9-9bcf-bc6ef951a464-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dff74db9-c703-44d9-9bcf-bc6ef951a464" (UID: "dff74db9-c703-44d9-9bcf-bc6ef951a464"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.198138 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dff74db9-c703-44d9-9bcf-bc6ef951a464-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.198186 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dff74db9-c703-44d9-9bcf-bc6ef951a464-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.198202 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzbgw\" (UniqueName: \"kubernetes.io/projected/dff74db9-c703-44d9-9bcf-bc6ef951a464-kube-api-access-kzbgw\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.730855 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" event={"ID":"dff74db9-c703-44d9-9bcf-bc6ef951a464","Type":"ContainerDied","Data":"46773c0b67fc93e389337267652c19307a58c08bdd6dcd89d7a631a6ce253859"} Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.730901 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46773c0b67fc93e389337267652c19307a58c08bdd6dcd89d7a631a6ce253859" Oct 02 11:45:03 crc kubenswrapper[4929]: I1002 11:45:03.730915 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq" Oct 02 11:45:04 crc kubenswrapper[4929]: I1002 11:45:04.051698 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb"] Oct 02 11:45:04 crc kubenswrapper[4929]: I1002 11:45:04.056643 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-4xtkb"] Oct 02 11:45:04 crc kubenswrapper[4929]: I1002 11:45:04.171710 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc152222-074c-4883-bb5f-f0a836a96023" path="/var/lib/kubelet/pods/cc152222-074c-4883-bb5f-f0a836a96023/volumes" Oct 02 11:45:14 crc kubenswrapper[4929]: I1002 11:45:14.331454 4929 scope.go:117] "RemoveContainer" containerID="27335184540d8540f6619896479f10caf0123a6806776241d36676eddf8e3ef9" Oct 02 11:45:44 crc kubenswrapper[4929]: I1002 11:45:44.736780 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:45:44 crc kubenswrapper[4929]: I1002 11:45:44.737362 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.422267 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rqtbc"] Oct 02 11:46:06 crc kubenswrapper[4929]: E1002 11:46:06.423126 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff74db9-c703-44d9-9bcf-bc6ef951a464" 
containerName="collect-profiles" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.423140 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff74db9-c703-44d9-9bcf-bc6ef951a464" containerName="collect-profiles" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.423315 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff74db9-c703-44d9-9bcf-bc6ef951a464" containerName="collect-profiles" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.424456 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.431416 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqtbc"] Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.495977 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-utilities\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.496047 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-catalog-content\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.496167 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9ch7\" (UniqueName: \"kubernetes.io/projected/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-kube-api-access-d9ch7\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.597188 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9ch7\" (UniqueName: \"kubernetes.io/projected/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-kube-api-access-d9ch7\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.597248 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-utilities\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.597276 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-catalog-content\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.597718 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-catalog-content\") pod \"community-operators-rqtbc\" (UID: 
\"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.597801 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-utilities\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.618640 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9ch7\" (UniqueName: \"kubernetes.io/projected/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-kube-api-access-d9ch7\") pod \"community-operators-rqtbc\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:06 crc kubenswrapper[4929]: I1002 11:46:06.756631 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:07 crc kubenswrapper[4929]: I1002 11:46:07.268246 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqtbc"] Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.207354 4929 generic.go:334] "Generic (PLEG): container finished" podID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerID="bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7" exitCode=0 Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.207436 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqtbc" event={"ID":"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b","Type":"ContainerDied","Data":"bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7"} Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.207923 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqtbc" event={"ID":"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b","Type":"ContainerStarted","Data":"4d17a528406033f96f96c0d2248bcfb2535261b3b0bd07b33d79f11fb8da7bfe"} Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.211703 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.620797 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c82pz"] Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.625713 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.637628 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd429\" (UniqueName: \"kubernetes.io/projected/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-kube-api-access-kd429\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.637813 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-catalog-content\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.637854 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-utilities\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.645615 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c82pz"] Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.739609 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd429\" (UniqueName: \"kubernetes.io/projected/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-kube-api-access-kd429\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.739695 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-catalog-content\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.739721 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-utilities\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.740406 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-catalog-content\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.740473 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-utilities\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.764443 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kd429\" (UniqueName: \"kubernetes.io/projected/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-kube-api-access-kd429\") pod \"redhat-marketplace-c82pz\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:08 crc kubenswrapper[4929]: I1002 11:46:08.950766 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:09 crc kubenswrapper[4929]: I1002 11:46:09.185043 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c82pz"] Oct 02 11:46:09 crc kubenswrapper[4929]: W1002 11:46:09.197185 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b1144bf_0a73_4750_ab8d_1d4ec0bc95f5.slice/crio-7882e353f9114dd58716880e1a33dfdb09e691625083b4ae3aed3f46459e5dd2 WatchSource:0}: Error finding container 7882e353f9114dd58716880e1a33dfdb09e691625083b4ae3aed3f46459e5dd2: Status 404 returned error can't find the container with id 7882e353f9114dd58716880e1a33dfdb09e691625083b4ae3aed3f46459e5dd2 Oct 02 11:46:09 crc kubenswrapper[4929]: I1002 11:46:09.217041 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c82pz" event={"ID":"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5","Type":"ContainerStarted","Data":"7882e353f9114dd58716880e1a33dfdb09e691625083b4ae3aed3f46459e5dd2"} Oct 02 11:46:09 crc kubenswrapper[4929]: I1002 11:46:09.219451 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqtbc" event={"ID":"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b","Type":"ContainerStarted","Data":"1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207"} Oct 02 11:46:10 crc kubenswrapper[4929]: I1002 11:46:10.231835 4929 generic.go:334] "Generic (PLEG): container finished" podID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerID="1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207" exitCode=0 Oct 02 11:46:10 crc kubenswrapper[4929]: I1002 11:46:10.231880 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqtbc" event={"ID":"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b","Type":"ContainerDied","Data":"1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207"} Oct 02 11:46:10 crc kubenswrapper[4929]: I1002 11:46:10.235501 4929 generic.go:334] "Generic (PLEG): container finished" podID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerID="ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7" exitCode=0 Oct 02 11:46:10 crc kubenswrapper[4929]: I1002 11:46:10.235569 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c82pz" event={"ID":"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5","Type":"ContainerDied","Data":"ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7"} Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.214134 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zfh6s"] Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.216614 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.229618 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfh6s"] Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.251821 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqtbc" event={"ID":"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b","Type":"ContainerStarted","Data":"4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0"} Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.257296 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c82pz" event={"ID":"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5","Type":"ContainerStarted","Data":"39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace"} Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.282593 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rqtbc" podStartSLOduration=2.834601219 podStartE2EDuration="5.282570442s" podCreationTimestamp="2025-10-02 11:46:06 +0000 UTC" firstStartedPulling="2025-10-02 11:46:08.211264131 +0000 UTC m=+2168.761630495" lastFinishedPulling="2025-10-02 11:46:10.659233354 +0000 UTC m=+2171.209599718" observedRunningTime="2025-10-02 11:46:11.277237131 +0000 UTC m=+2171.827603505" watchObservedRunningTime="2025-10-02 11:46:11.282570442 +0000 UTC m=+2171.832936806" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.380669 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-utilities\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.381059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-catalog-content\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.381245 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps6qc\" (UniqueName: \"kubernetes.io/projected/cd302fcf-329c-4b9b-b359-2b959bf68491-kube-api-access-ps6qc\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.483371 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-catalog-content\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.483444 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps6qc\" (UniqueName: \"kubernetes.io/projected/cd302fcf-329c-4b9b-b359-2b959bf68491-kube-api-access-ps6qc\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" 
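The three openshift-marketplace catalog pods in this stretch (community-operators-rqtbc, redhat-marketplace-c82pz, redhat-operators-zfh6s) all follow the same lifecycle: the extract-utilities and extract-content containers run and exit 0, then registry-server starts and its startup/readiness probes flip to ready. A minimal sketch for pulling that sequence out of captured kubenswrapper output, assuming one journal entry per line as journalctl emits it; the script name and argument handling are illustrative, and only the two message shapes it matches ("SyncLoop (PLEG): event for pod" and "Generic (PLEG): container finished") come from the log itself:

#!/usr/bin/env python3
"""trace_pleg.py -- sketch: trace per-pod container lifecycle from kubelet journal lines.

Only understands the two record shapes visible in this log; anything else is ignored.
"""
import re
import sys

# "SyncLoop (PLEG): event for pod" pod="ns/name" event={"ID":"...","Type":"...","Data":"<hex id>"}
PLEG = re.compile(
    r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" '
    r'event=\{"ID":"[^"]+","Type":"(?P<type>\w+)","Data":"(?P<cid>[0-9a-f]+)"\}'
)
# "Generic (PLEG): container finished" podID="..." containerID="<hex id>" exitCode=N
FINISHED = re.compile(
    r'"Generic \(PLEG\): container finished" podID="[^"]+" '
    r'containerID="(?P<cid>[0-9a-f]+)" exitCode=(?P<rc>-?\d+)'
)
# Leading journal timestamp, e.g. "Oct 02 11:46:12"
TS = re.compile(r'^\w{3} \d{2} (\d{2}:\d{2}:\d{2})')

def trace(stream, pod_substr=""):
    for line in stream:
        ts = TS.match(line)
        stamp = ts.group(1) if ts else "??:??:??"
        m = PLEG.search(line)
        if m and pod_substr in m["pod"]:
            print(f'{stamp}  {m["type"]:<16} {m["cid"][:12]}  {m["pod"]}')
            continue
        m = FINISHED.search(line)
        if m:
            print(f'{stamp}  exit rc={m["rc"]:<10} {m["cid"][:12]}')

if __name__ == "__main__":
    # e.g.  journalctl -u kubelet --no-pager | python3 trace_pleg.py redhat-operators
    trace(sys.stdin, sys.argv[1] if len(sys.argv) > 1 else "")

Fed this section, it prints ContainerStarted/ContainerDied pairs per container ID, which is enough to line the exitCode=0 extract steps up against the registry-server start that the podStartSLOduration entries then summarize.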
Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.483508 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-utilities\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.484166 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-catalog-content\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.484230 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-utilities\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.519699 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps6qc\" (UniqueName: \"kubernetes.io/projected/cd302fcf-329c-4b9b-b359-2b959bf68491-kube-api-access-ps6qc\") pod \"redhat-operators-zfh6s\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.551691 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:11 crc kubenswrapper[4929]: I1002 11:46:11.820373 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfh6s"] Oct 02 11:46:12 crc kubenswrapper[4929]: I1002 11:46:12.265410 4929 generic.go:334] "Generic (PLEG): container finished" podID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerID="39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace" exitCode=0 Oct 02 11:46:12 crc kubenswrapper[4929]: I1002 11:46:12.265842 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c82pz" event={"ID":"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5","Type":"ContainerDied","Data":"39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace"} Oct 02 11:46:12 crc kubenswrapper[4929]: I1002 11:46:12.265883 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c82pz" event={"ID":"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5","Type":"ContainerStarted","Data":"cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64"} Oct 02 11:46:12 crc kubenswrapper[4929]: I1002 11:46:12.268528 4929 generic.go:334] "Generic (PLEG): container finished" podID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerID="aad335387bd18f8a4a10ec8abe78b62f1f8e8c4861401726aa454c0429c18221" exitCode=0 Oct 02 11:46:12 crc kubenswrapper[4929]: I1002 11:46:12.269826 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfh6s" event={"ID":"cd302fcf-329c-4b9b-b359-2b959bf68491","Type":"ContainerDied","Data":"aad335387bd18f8a4a10ec8abe78b62f1f8e8c4861401726aa454c0429c18221"} Oct 02 11:46:12 crc kubenswrapper[4929]: I1002 11:46:12.269853 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfh6s" 
event={"ID":"cd302fcf-329c-4b9b-b359-2b959bf68491","Type":"ContainerStarted","Data":"bcf20b5d580d113758174945cee2b394d35c0417a142256b7773a7ec9d061546"} Oct 02 11:46:12 crc kubenswrapper[4929]: I1002 11:46:12.290523 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c82pz" podStartSLOduration=2.851366666 podStartE2EDuration="4.290500392s" podCreationTimestamp="2025-10-02 11:46:08 +0000 UTC" firstStartedPulling="2025-10-02 11:46:10.236866409 +0000 UTC m=+2170.787232773" lastFinishedPulling="2025-10-02 11:46:11.676000135 +0000 UTC m=+2172.226366499" observedRunningTime="2025-10-02 11:46:12.287536178 +0000 UTC m=+2172.837902562" watchObservedRunningTime="2025-10-02 11:46:12.290500392 +0000 UTC m=+2172.840866756" Oct 02 11:46:13 crc kubenswrapper[4929]: I1002 11:46:13.279281 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfh6s" event={"ID":"cd302fcf-329c-4b9b-b359-2b959bf68491","Type":"ContainerStarted","Data":"af90de6c60d4d0fd2d30206f1f986673130bdbfc81d7dedd5acd3fcee7210bdc"} Oct 02 11:46:14 crc kubenswrapper[4929]: I1002 11:46:14.289171 4929 generic.go:334] "Generic (PLEG): container finished" podID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerID="af90de6c60d4d0fd2d30206f1f986673130bdbfc81d7dedd5acd3fcee7210bdc" exitCode=0 Oct 02 11:46:14 crc kubenswrapper[4929]: I1002 11:46:14.289246 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfh6s" event={"ID":"cd302fcf-329c-4b9b-b359-2b959bf68491","Type":"ContainerDied","Data":"af90de6c60d4d0fd2d30206f1f986673130bdbfc81d7dedd5acd3fcee7210bdc"} Oct 02 11:46:14 crc kubenswrapper[4929]: I1002 11:46:14.737191 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:46:14 crc kubenswrapper[4929]: I1002 11:46:14.737259 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:46:15 crc kubenswrapper[4929]: I1002 11:46:15.300998 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfh6s" event={"ID":"cd302fcf-329c-4b9b-b359-2b959bf68491","Type":"ContainerStarted","Data":"84591d3645b91aa3b583ff79c47e9d98865130a7cce517240a5a17028980c8c3"} Oct 02 11:46:15 crc kubenswrapper[4929]: I1002 11:46:15.319888 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zfh6s" podStartSLOduration=1.841778495 podStartE2EDuration="4.319870792s" podCreationTimestamp="2025-10-02 11:46:11 +0000 UTC" firstStartedPulling="2025-10-02 11:46:12.269913408 +0000 UTC m=+2172.820279772" lastFinishedPulling="2025-10-02 11:46:14.748005705 +0000 UTC m=+2175.298372069" observedRunningTime="2025-10-02 11:46:15.318247356 +0000 UTC m=+2175.868613730" watchObservedRunningTime="2025-10-02 11:46:15.319870792 +0000 UTC m=+2175.870237156" Oct 02 11:46:16 crc kubenswrapper[4929]: I1002 11:46:16.757243 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:16 crc kubenswrapper[4929]: I1002 11:46:16.757289 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:16 crc kubenswrapper[4929]: I1002 11:46:16.800233 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:17 crc kubenswrapper[4929]: I1002 11:46:17.356303 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:18 crc kubenswrapper[4929]: I1002 11:46:18.951634 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:18 crc kubenswrapper[4929]: I1002 11:46:18.951686 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:19 crc kubenswrapper[4929]: I1002 11:46:19.002143 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:19 crc kubenswrapper[4929]: I1002 11:46:19.372933 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:20 crc kubenswrapper[4929]: I1002 11:46:20.004853 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqtbc"] Oct 02 11:46:20 crc kubenswrapper[4929]: I1002 11:46:20.005137 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rqtbc" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="registry-server" containerID="cri-o://4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0" gracePeriod=2 Oct 02 11:46:20 crc kubenswrapper[4929]: I1002 11:46:20.980099 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.003486 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c82pz"] Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.128678 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-catalog-content\") pod \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.128720 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9ch7\" (UniqueName: \"kubernetes.io/projected/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-kube-api-access-d9ch7\") pod \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.128792 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-utilities\") pod \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\" (UID: \"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b\") " Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.130512 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-utilities" (OuterVolumeSpecName: "utilities") pod "aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" (UID: "aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.141633 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-kube-api-access-d9ch7" (OuterVolumeSpecName: "kube-api-access-d9ch7") pod "aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" (UID: "aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b"). InnerVolumeSpecName "kube-api-access-d9ch7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.180598 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" (UID: "aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.230809 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.230940 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.230978 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9ch7\" (UniqueName: \"kubernetes.io/projected/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b-kube-api-access-d9ch7\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.345254 4929 generic.go:334] "Generic (PLEG): container finished" podID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerID="4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0" exitCode=0 Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.345298 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqtbc" event={"ID":"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b","Type":"ContainerDied","Data":"4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0"} Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.345344 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqtbc" event={"ID":"aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b","Type":"ContainerDied","Data":"4d17a528406033f96f96c0d2248bcfb2535261b3b0bd07b33d79f11fb8da7bfe"} Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.345345 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rqtbc" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.345368 4929 scope.go:117] "RemoveContainer" containerID="4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.345458 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c82pz" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="registry-server" containerID="cri-o://cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64" gracePeriod=2 Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.373178 4929 scope.go:117] "RemoveContainer" containerID="1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.388088 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqtbc"] Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.398460 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rqtbc"] Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.439012 4929 scope.go:117] "RemoveContainer" containerID="bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.468471 4929 scope.go:117] "RemoveContainer" containerID="4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0" Oct 02 11:46:21 crc kubenswrapper[4929]: E1002 11:46:21.469214 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0\": container with ID starting with 4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0 not found: ID does not exist" containerID="4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.469264 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0"} err="failed to get container status \"4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0\": rpc error: code = NotFound desc = could not find container \"4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0\": container with ID starting with 4c791105b149dda5172c1eeb154de03fb05a0fd31e91102e4b54c868c2fface0 not found: ID does not exist" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.469299 4929 scope.go:117] "RemoveContainer" containerID="1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207" Oct 02 11:46:21 crc kubenswrapper[4929]: E1002 11:46:21.470044 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207\": container with ID starting with 1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207 not found: ID does not exist" containerID="1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.470127 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207"} err="failed to get container status \"1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207\": rpc 
error: code = NotFound desc = could not find container \"1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207\": container with ID starting with 1e40e4f5bf5539a976a9340dd12035adfd226d5d1dee12fcea97bd72b4750207 not found: ID does not exist" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.470180 4929 scope.go:117] "RemoveContainer" containerID="bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7" Oct 02 11:46:21 crc kubenswrapper[4929]: E1002 11:46:21.470604 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7\": container with ID starting with bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7 not found: ID does not exist" containerID="bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.470649 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7"} err="failed to get container status \"bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7\": rpc error: code = NotFound desc = could not find container \"bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7\": container with ID starting with bb9aa06a53b194849a04f06d95420151923d685986ce762e8c01d4f6b31724f7 not found: ID does not exist" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.552953 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.553023 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.607483 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.762817 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.944652 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-catalog-content\") pod \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.944709 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-utilities\") pod \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.944778 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd429\" (UniqueName: \"kubernetes.io/projected/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-kube-api-access-kd429\") pod \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\" (UID: \"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5\") " Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.945809 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-utilities" (OuterVolumeSpecName: "utilities") pod "5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" (UID: "5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.948905 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-kube-api-access-kd429" (OuterVolumeSpecName: "kube-api-access-kd429") pod "5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" (UID: "5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5"). InnerVolumeSpecName "kube-api-access-kd429". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:46:21 crc kubenswrapper[4929]: I1002 11:46:21.961571 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" (UID: "5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.046593 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.046646 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.046662 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd429\" (UniqueName: \"kubernetes.io/projected/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5-kube-api-access-kd429\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.173674 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" path="/var/lib/kubelet/pods/aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b/volumes" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.355307 4929 generic.go:334] "Generic (PLEG): container finished" podID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerID="cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64" exitCode=0 Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.355368 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c82pz" event={"ID":"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5","Type":"ContainerDied","Data":"cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64"} Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.355465 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c82pz" event={"ID":"5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5","Type":"ContainerDied","Data":"7882e353f9114dd58716880e1a33dfdb09e691625083b4ae3aed3f46459e5dd2"} Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.355513 4929 scope.go:117] "RemoveContainer" containerID="cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.355691 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c82pz" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.385471 4929 scope.go:117] "RemoveContainer" containerID="39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.390458 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c82pz"] Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.398475 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c82pz"] Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.402511 4929 scope.go:117] "RemoveContainer" containerID="ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.408871 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.422294 4929 scope.go:117] "RemoveContainer" containerID="cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64" Oct 02 11:46:22 crc kubenswrapper[4929]: E1002 11:46:22.422945 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64\": container with ID starting with cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64 not found: ID does not exist" containerID="cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.423032 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64"} err="failed to get container status \"cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64\": rpc error: code = NotFound desc = could not find container \"cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64\": container with ID starting with cafe2c41ee26f87e272015188b349305db02245887b497fe72704d77fed8bc64 not found: ID does not exist" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.423085 4929 scope.go:117] "RemoveContainer" containerID="39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace" Oct 02 11:46:22 crc kubenswrapper[4929]: E1002 11:46:22.424765 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace\": container with ID starting with 39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace not found: ID does not exist" containerID="39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.424877 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace"} err="failed to get container status \"39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace\": rpc error: code = NotFound desc = could not find container \"39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace\": container with ID starting with 39d3e243ea1fdcc9bc37b7ded4315f33ff9c656a9407ee13664e0c0233909ace not found: ID does not exist" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.424929 4929 scope.go:117] "RemoveContainer" 
containerID="ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7" Oct 02 11:46:22 crc kubenswrapper[4929]: E1002 11:46:22.438707 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7\": container with ID starting with ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7 not found: ID does not exist" containerID="ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7" Oct 02 11:46:22 crc kubenswrapper[4929]: I1002 11:46:22.438808 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7"} err="failed to get container status \"ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7\": rpc error: code = NotFound desc = could not find container \"ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7\": container with ID starting with ee5616deca904e4eada557dc07da1c97353a0b6694e9d37d4489659688a245b7 not found: ID does not exist" Oct 02 11:46:23 crc kubenswrapper[4929]: I1002 11:46:23.006162 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfh6s"] Oct 02 11:46:24 crc kubenswrapper[4929]: I1002 11:46:24.164471 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" path="/var/lib/kubelet/pods/5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5/volumes" Oct 02 11:46:24 crc kubenswrapper[4929]: I1002 11:46:24.371254 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zfh6s" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="registry-server" containerID="cri-o://84591d3645b91aa3b583ff79c47e9d98865130a7cce517240a5a17028980c8c3" gracePeriod=2 Oct 02 11:46:25 crc kubenswrapper[4929]: I1002 11:46:25.379604 4929 generic.go:334] "Generic (PLEG): container finished" podID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerID="84591d3645b91aa3b583ff79c47e9d98865130a7cce517240a5a17028980c8c3" exitCode=0 Oct 02 11:46:25 crc kubenswrapper[4929]: I1002 11:46:25.379687 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfh6s" event={"ID":"cd302fcf-329c-4b9b-b359-2b959bf68491","Type":"ContainerDied","Data":"84591d3645b91aa3b583ff79c47e9d98865130a7cce517240a5a17028980c8c3"} Oct 02 11:46:25 crc kubenswrapper[4929]: I1002 11:46:25.843071 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.001601 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps6qc\" (UniqueName: \"kubernetes.io/projected/cd302fcf-329c-4b9b-b359-2b959bf68491-kube-api-access-ps6qc\") pod \"cd302fcf-329c-4b9b-b359-2b959bf68491\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.001681 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-utilities\") pod \"cd302fcf-329c-4b9b-b359-2b959bf68491\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.001724 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-catalog-content\") pod \"cd302fcf-329c-4b9b-b359-2b959bf68491\" (UID: \"cd302fcf-329c-4b9b-b359-2b959bf68491\") " Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.002643 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-utilities" (OuterVolumeSpecName: "utilities") pod "cd302fcf-329c-4b9b-b359-2b959bf68491" (UID: "cd302fcf-329c-4b9b-b359-2b959bf68491"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.007395 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd302fcf-329c-4b9b-b359-2b959bf68491-kube-api-access-ps6qc" (OuterVolumeSpecName: "kube-api-access-ps6qc") pod "cd302fcf-329c-4b9b-b359-2b959bf68491" (UID: "cd302fcf-329c-4b9b-b359-2b959bf68491"). InnerVolumeSpecName "kube-api-access-ps6qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.087649 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd302fcf-329c-4b9b-b359-2b959bf68491" (UID: "cd302fcf-329c-4b9b-b359-2b959bf68491"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.103280 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps6qc\" (UniqueName: \"kubernetes.io/projected/cd302fcf-329c-4b9b-b359-2b959bf68491-kube-api-access-ps6qc\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.103316 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.103330 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd302fcf-329c-4b9b-b359-2b959bf68491-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.388324 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfh6s" event={"ID":"cd302fcf-329c-4b9b-b359-2b959bf68491","Type":"ContainerDied","Data":"bcf20b5d580d113758174945cee2b394d35c0417a142256b7773a7ec9d061546"} Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.388375 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfh6s" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.388384 4929 scope.go:117] "RemoveContainer" containerID="84591d3645b91aa3b583ff79c47e9d98865130a7cce517240a5a17028980c8c3" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.412787 4929 scope.go:117] "RemoveContainer" containerID="af90de6c60d4d0fd2d30206f1f986673130bdbfc81d7dedd5acd3fcee7210bdc" Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.414282 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfh6s"] Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.420975 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zfh6s"] Oct 02 11:46:26 crc kubenswrapper[4929]: I1002 11:46:26.431278 4929 scope.go:117] "RemoveContainer" containerID="aad335387bd18f8a4a10ec8abe78b62f1f8e8c4861401726aa454c0429c18221" Oct 02 11:46:28 crc kubenswrapper[4929]: I1002 11:46:28.167171 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" path="/var/lib/kubelet/pods/cd302fcf-329c-4b9b-b359-2b959bf68491/volumes" Oct 02 11:46:44 crc kubenswrapper[4929]: I1002 11:46:44.737186 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:46:44 crc kubenswrapper[4929]: I1002 11:46:44.737756 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:46:44 crc kubenswrapper[4929]: I1002 11:46:44.737836 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:46:44 crc kubenswrapper[4929]: I1002 11:46:44.738521 4929 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6da31fe02e5a524e585a14cfb339228963a2b369ab656c7717da73044298b165"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:46:44 crc kubenswrapper[4929]: I1002 11:46:44.738589 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://6da31fe02e5a524e585a14cfb339228963a2b369ab656c7717da73044298b165" gracePeriod=600 Oct 02 11:46:45 crc kubenswrapper[4929]: I1002 11:46:45.529236 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="6da31fe02e5a524e585a14cfb339228963a2b369ab656c7717da73044298b165" exitCode=0 Oct 02 11:46:45 crc kubenswrapper[4929]: I1002 11:46:45.529293 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"6da31fe02e5a524e585a14cfb339228963a2b369ab656c7717da73044298b165"} Oct 02 11:46:45 crc kubenswrapper[4929]: I1002 11:46:45.529938 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"} Oct 02 11:46:45 crc kubenswrapper[4929]: I1002 11:46:45.529999 4929 scope.go:117] "RemoveContainer" containerID="ed5cb39c064d25f6ff87d3f8c6c8c60fb1f246214f6fedb73ffc3e727a47d4b0" Oct 02 11:49:14 crc kubenswrapper[4929]: I1002 11:49:14.736752 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:49:14 crc kubenswrapper[4929]: I1002 11:49:14.737367 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:49:44 crc kubenswrapper[4929]: I1002 11:49:44.736546 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:49:44 crc kubenswrapper[4929]: I1002 11:49:44.737014 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:50:14 crc kubenswrapper[4929]: I1002 11:50:14.737358 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:50:14 crc kubenswrapper[4929]: I1002 11:50:14.739346 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:50:14 crc kubenswrapper[4929]: I1002 11:50:14.739504 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:50:14 crc kubenswrapper[4929]: I1002 11:50:14.740318 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:50:14 crc kubenswrapper[4929]: I1002 11:50:14.740484 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199" gracePeriod=600 Oct 02 11:50:14 crc kubenswrapper[4929]: E1002 11:50:14.857543 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 11:50:15 crc kubenswrapper[4929]: I1002 11:50:15.020004 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199" exitCode=0 Oct 02 11:50:15 crc kubenswrapper[4929]: I1002 11:50:15.020373 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"} Oct 02 11:50:15 crc kubenswrapper[4929]: I1002 11:50:15.020494 4929 scope.go:117] "RemoveContainer" containerID="6da31fe02e5a524e585a14cfb339228963a2b369ab656c7717da73044298b165" Oct 02 11:50:15 crc kubenswrapper[4929]: I1002 11:50:15.021112 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199" Oct 02 11:50:15 crc kubenswrapper[4929]: E1002 11:50:15.021336 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" 
Oct 02 11:50:27 crc kubenswrapper[4929]: I1002 11:50:27.157090 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:50:27 crc kubenswrapper[4929]: E1002 11:50:27.157974 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:50:39 crc kubenswrapper[4929]: I1002 11:50:39.157164 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:50:39 crc kubenswrapper[4929]: E1002 11:50:39.157826 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:50:53 crc kubenswrapper[4929]: I1002 11:50:53.156082 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:50:53 crc kubenswrapper[4929]: E1002 11:50:53.156819 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:51:06 crc kubenswrapper[4929]: I1002 11:51:06.156899 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:51:06 crc kubenswrapper[4929]: E1002 11:51:06.157513 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:51:17 crc kubenswrapper[4929]: I1002 11:51:17.156781 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:51:17 crc kubenswrapper[4929]: E1002 11:51:17.157423 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:51:28 crc kubenswrapper[4929]: I1002 11:51:28.157288 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:51:28 crc kubenswrapper[4929]: E1002 11:51:28.158374 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:51:43 crc kubenswrapper[4929]: I1002 11:51:43.156255 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:51:43 crc kubenswrapper[4929]: E1002 11:51:43.156839 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:51:57 crc kubenswrapper[4929]: I1002 11:51:57.156506 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:51:57 crc kubenswrapper[4929]: E1002 11:51:57.157183 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:52:09 crc kubenswrapper[4929]: I1002 11:52:09.156303 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:52:09 crc kubenswrapper[4929]: E1002 11:52:09.157033 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:52:24 crc kubenswrapper[4929]: I1002 11:52:24.157022 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:52:24 crc kubenswrapper[4929]: E1002 11:52:24.157724 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:52:39 crc kubenswrapper[4929]: I1002 11:52:39.156830 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:52:39 crc kubenswrapper[4929]: E1002 11:52:39.157559 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:52:53 crc kubenswrapper[4929]: I1002 11:52:53.156770 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:52:53 crc kubenswrapper[4929]: E1002 11:52:53.157405 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:53:05 crc kubenswrapper[4929]: I1002 11:53:05.157047 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:53:05 crc kubenswrapper[4929]: E1002 11:53:05.157806 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:53:19 crc kubenswrapper[4929]: I1002 11:53:19.157399 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:53:19 crc kubenswrapper[4929]: E1002 11:53:19.159038 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:53:34 crc kubenswrapper[4929]: I1002 11:53:34.156734 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:53:34 crc kubenswrapper[4929]: E1002 11:53:34.157725 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:53:47 crc kubenswrapper[4929]: I1002 11:53:47.157206 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:53:47 crc kubenswrapper[4929]: E1002 11:53:47.158342 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:54:00 crc kubenswrapper[4929]: I1002 11:54:00.159996 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:54:00 crc kubenswrapper[4929]: E1002 11:54:00.161076 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:54:14 crc kubenswrapper[4929]: I1002 11:54:14.157425 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:54:14 crc kubenswrapper[4929]: E1002 11:54:14.158563 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401270 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-84l69"]
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401734 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="extract-content"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401747 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="extract-content"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401759 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="extract-content"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401767 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="extract-content"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401785 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="extract-utilities"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401792 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="extract-utilities"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401803 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401808 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401818 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401824 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401844 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401851 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401864 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="extract-utilities"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401869 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="extract-utilities"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401876 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="extract-utilities"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401882 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="extract-utilities"
Oct 02 11:54:15 crc kubenswrapper[4929]: E1002 11:54:15.401891 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="extract-content"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.401896 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="extract-content"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.402047 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b1144bf-0a73-4750-ab8d-1d4ec0bc95f5" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.402067 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb3bbdc-3eaf-4e84-bb7a-228ef20c509b" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.402081 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd302fcf-329c-4b9b-b359-2b959bf68491" containerName="registry-server"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.403054 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.432162 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84l69"]
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.504401 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-catalog-content\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.504696 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-utilities\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.504787 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmm97\" (UniqueName: \"kubernetes.io/projected/4670744b-2665-4417-8ffa-b5884ec1cd33-kube-api-access-qmm97\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.606526 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-catalog-content\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.606578 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-utilities\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.606621 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmm97\" (UniqueName: \"kubernetes.io/projected/4670744b-2665-4417-8ffa-b5884ec1cd33-kube-api-access-qmm97\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.606974 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-catalog-content\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.607221 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-utilities\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.629307 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmm97\" (UniqueName: \"kubernetes.io/projected/4670744b-2665-4417-8ffa-b5884ec1cd33-kube-api-access-qmm97\") pod \"certified-operators-84l69\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") " pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:15 crc kubenswrapper[4929]: I1002 11:54:15.726759 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:16 crc kubenswrapper[4929]: I1002 11:54:16.204389 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84l69"]
Oct 02 11:54:16 crc kubenswrapper[4929]: I1002 11:54:16.788835 4929 generic.go:334] "Generic (PLEG): container finished" podID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerID="323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51" exitCode=0
Oct 02 11:54:16 crc kubenswrapper[4929]: I1002 11:54:16.788886 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84l69" event={"ID":"4670744b-2665-4417-8ffa-b5884ec1cd33","Type":"ContainerDied","Data":"323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51"}
Oct 02 11:54:16 crc kubenswrapper[4929]: I1002 11:54:16.789137 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84l69" event={"ID":"4670744b-2665-4417-8ffa-b5884ec1cd33","Type":"ContainerStarted","Data":"00470576390cf5bdc833a649f683b9de5107cf6d5106d487a254ab86d44c89b4"}
Oct 02 11:54:16 crc kubenswrapper[4929]: I1002 11:54:16.791425 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 11:54:18 crc kubenswrapper[4929]: I1002 11:54:18.816208 4929 generic.go:334] "Generic (PLEG): container finished" podID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerID="3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f" exitCode=0
Oct 02 11:54:18 crc kubenswrapper[4929]: I1002 11:54:18.816309 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84l69" event={"ID":"4670744b-2665-4417-8ffa-b5884ec1cd33","Type":"ContainerDied","Data":"3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f"}
Oct 02 11:54:19 crc kubenswrapper[4929]: I1002 11:54:19.826847 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84l69" event={"ID":"4670744b-2665-4417-8ffa-b5884ec1cd33","Type":"ContainerStarted","Data":"4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03"}
Oct 02 11:54:19 crc kubenswrapper[4929]: I1002 11:54:19.844777 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-84l69" podStartSLOduration=2.2864022520000002 podStartE2EDuration="4.844751483s" podCreationTimestamp="2025-10-02 11:54:15 +0000 UTC" firstStartedPulling="2025-10-02 11:54:16.791189286 +0000 UTC m=+2657.341555650" lastFinishedPulling="2025-10-02 11:54:19.349538517 +0000 UTC m=+2659.899904881" observedRunningTime="2025-10-02 11:54:19.840692527 +0000 UTC m=+2660.391058931" watchObservedRunningTime="2025-10-02 11:54:19.844751483 +0000 UTC m=+2660.395117867"
Oct 02 11:54:25 crc kubenswrapper[4929]: I1002 11:54:25.727044 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:25 crc kubenswrapper[4929]: I1002 11:54:25.727303 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:25 crc kubenswrapper[4929]: I1002 11:54:25.764661 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:25 crc kubenswrapper[4929]: I1002 11:54:25.900532 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:25 crc kubenswrapper[4929]: I1002 11:54:25.998349 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84l69"]
Oct 02 11:54:27 crc kubenswrapper[4929]: I1002 11:54:27.156605 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:54:27 crc kubenswrapper[4929]: E1002 11:54:27.157789 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:54:27 crc kubenswrapper[4929]: I1002 11:54:27.877817 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-84l69" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="registry-server" containerID="cri-o://4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03" gracePeriod=2
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.257072 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.387135 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-catalog-content\") pod \"4670744b-2665-4417-8ffa-b5884ec1cd33\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") "
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.387195 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-utilities\") pod \"4670744b-2665-4417-8ffa-b5884ec1cd33\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") "
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.387221 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmm97\" (UniqueName: \"kubernetes.io/projected/4670744b-2665-4417-8ffa-b5884ec1cd33-kube-api-access-qmm97\") pod \"4670744b-2665-4417-8ffa-b5884ec1cd33\" (UID: \"4670744b-2665-4417-8ffa-b5884ec1cd33\") "
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.387936 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-utilities" (OuterVolumeSpecName: "utilities") pod "4670744b-2665-4417-8ffa-b5884ec1cd33" (UID: "4670744b-2665-4417-8ffa-b5884ec1cd33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.392537 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4670744b-2665-4417-8ffa-b5884ec1cd33-kube-api-access-qmm97" (OuterVolumeSpecName: "kube-api-access-qmm97") pod "4670744b-2665-4417-8ffa-b5884ec1cd33" (UID: "4670744b-2665-4417-8ffa-b5884ec1cd33"). InnerVolumeSpecName "kube-api-access-qmm97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.442102 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4670744b-2665-4417-8ffa-b5884ec1cd33" (UID: "4670744b-2665-4417-8ffa-b5884ec1cd33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.489042 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.489100 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4670744b-2665-4417-8ffa-b5884ec1cd33-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.489115 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmm97\" (UniqueName: \"kubernetes.io/projected/4670744b-2665-4417-8ffa-b5884ec1cd33-kube-api-access-qmm97\") on node \"crc\" DevicePath \"\""
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.886705 4929 generic.go:334] "Generic (PLEG): container finished" podID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerID="4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03" exitCode=0
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.886754 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84l69" event={"ID":"4670744b-2665-4417-8ffa-b5884ec1cd33","Type":"ContainerDied","Data":"4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03"}
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.886800 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84l69" event={"ID":"4670744b-2665-4417-8ffa-b5884ec1cd33","Type":"ContainerDied","Data":"00470576390cf5bdc833a649f683b9de5107cf6d5106d487a254ab86d44c89b4"}
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.886822 4929 scope.go:117] "RemoveContainer" containerID="4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.887224 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84l69"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.917148 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84l69"]
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.918475 4929 scope.go:117] "RemoveContainer" containerID="3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.923727 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-84l69"]
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.944300 4929 scope.go:117] "RemoveContainer" containerID="323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.965765 4929 scope.go:117] "RemoveContainer" containerID="4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03"
Oct 02 11:54:28 crc kubenswrapper[4929]: E1002 11:54:28.966300 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03\": container with ID starting with 4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03 not found: ID does not exist" containerID="4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.966346 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03"} err="failed to get container status \"4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03\": rpc error: code = NotFound desc = could not find container \"4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03\": container with ID starting with 4eabdd21f6d9b850c7cc84304f445033a6776cf52748eefe512424ede8626a03 not found: ID does not exist"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.966378 4929 scope.go:117] "RemoveContainer" containerID="3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f"
Oct 02 11:54:28 crc kubenswrapper[4929]: E1002 11:54:28.966716 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f\": container with ID starting with 3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f not found: ID does not exist" containerID="3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.966841 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f"} err="failed to get container status \"3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f\": rpc error: code = NotFound desc = could not find container \"3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f\": container with ID starting with 3b44063557748c9d5835e119b08a52508afe8e5bac4ec09d5dcd262a7b194c1f not found: ID does not exist"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.966876 4929 scope.go:117] "RemoveContainer" containerID="323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51"
Oct 02 11:54:28 crc kubenswrapper[4929]: E1002 11:54:28.967471 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51\": container with ID starting with 323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51 not found: ID does not exist" containerID="323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51"
Oct 02 11:54:28 crc kubenswrapper[4929]: I1002 11:54:28.967502 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51"} err="failed to get container status \"323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51\": rpc error: code = NotFound desc = could not find container \"323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51\": container with ID starting with 323adf3caf13a8bd68303bc902c9c74f9dbd4745b3411bbcf6f131986ab61c51 not found: ID does not exist"
Oct 02 11:54:30 crc kubenswrapper[4929]: I1002 11:54:30.166215 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" path="/var/lib/kubelet/pods/4670744b-2665-4417-8ffa-b5884ec1cd33/volumes"
Oct 02 11:54:39 crc kubenswrapper[4929]: I1002 11:54:39.157548 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:54:39 crc kubenswrapper[4929]: E1002 11:54:39.158375 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:54:50 crc kubenswrapper[4929]: I1002 11:54:50.159965 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:54:50 crc kubenswrapper[4929]: E1002 11:54:50.160736 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:55:02 crc kubenswrapper[4929]: I1002 11:55:02.156766 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:55:02 crc kubenswrapper[4929]: E1002 11:55:02.158222 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 11:55:16 crc kubenswrapper[4929]: I1002 11:55:16.157179 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199"
Oct 02 11:55:17 crc kubenswrapper[4929]: I1002 11:55:17.235236 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"4d57889afc11f842043961ae2a5f223d1d9fb98239a13c2cb57d89197046bf42"}
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.799777 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-674v6"]
Oct 02 11:56:58 crc kubenswrapper[4929]: E1002 11:56:58.800644 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="extract-utilities"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.800659 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="extract-utilities"
Oct 02 11:56:58 crc kubenswrapper[4929]: E1002 11:56:58.800677 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="registry-server"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.800684 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="registry-server"
Oct 02 11:56:58 crc kubenswrapper[4929]: E1002 11:56:58.800699 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="extract-content"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.800706 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="extract-content"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.800877 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4670744b-2665-4417-8ffa-b5884ec1cd33" containerName="registry-server"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.803764 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.809728 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-674v6"]
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.862790 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcs6\" (UniqueName: \"kubernetes.io/projected/a15373dd-fe2f-4dde-ae06-f91be538ecb3-kube-api-access-gxcs6\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.862842 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-utilities\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.862866 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-catalog-content\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.964509 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcs6\" (UniqueName: \"kubernetes.io/projected/a15373dd-fe2f-4dde-ae06-f91be538ecb3-kube-api-access-gxcs6\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.964588 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-utilities\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.964621 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-catalog-content\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.965227 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-catalog-content\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.965288 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-utilities\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:58 crc kubenswrapper[4929]: I1002 11:56:58.986882 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcs6\" (UniqueName: \"kubernetes.io/projected/a15373dd-fe2f-4dde-ae06-f91be538ecb3-kube-api-access-gxcs6\") pod \"redhat-marketplace-674v6\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") " pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:59 crc kubenswrapper[4929]: I1002 11:56:59.125293 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:56:59 crc kubenswrapper[4929]: I1002 11:56:59.525554 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-674v6"]
Oct 02 11:57:00 crc kubenswrapper[4929]: I1002 11:57:00.036720 4929 generic.go:334] "Generic (PLEG): container finished" podID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerID="62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c" exitCode=0
Oct 02 11:57:00 crc kubenswrapper[4929]: I1002 11:57:00.036820 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-674v6" event={"ID":"a15373dd-fe2f-4dde-ae06-f91be538ecb3","Type":"ContainerDied","Data":"62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c"}
Oct 02 11:57:00 crc kubenswrapper[4929]: I1002 11:57:00.037131 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-674v6" event={"ID":"a15373dd-fe2f-4dde-ae06-f91be538ecb3","Type":"ContainerStarted","Data":"32b25baf39a37f24611e2aaa56257dbdc735d88468d4fec446c6a0147ece24c3"}
Oct 02 11:57:01 crc kubenswrapper[4929]: I1002 11:57:01.045934 4929 generic.go:334] "Generic (PLEG): container finished" podID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerID="66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db" exitCode=0
Oct 02 11:57:01 crc kubenswrapper[4929]: I1002 11:57:01.046178 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-674v6" event={"ID":"a15373dd-fe2f-4dde-ae06-f91be538ecb3","Type":"ContainerDied","Data":"66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db"}
Oct 02 11:57:02 crc kubenswrapper[4929]: I1002 11:57:02.057478 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-674v6" event={"ID":"a15373dd-fe2f-4dde-ae06-f91be538ecb3","Type":"ContainerStarted","Data":"c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4"}
Oct 02 11:57:02 crc kubenswrapper[4929]: I1002 11:57:02.077802 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-674v6" podStartSLOduration=2.572311048 podStartE2EDuration="4.077781991s" podCreationTimestamp="2025-10-02 11:56:58 +0000 UTC" firstStartedPulling="2025-10-02 11:57:00.038883574 +0000 UTC m=+2820.589249958" lastFinishedPulling="2025-10-02 11:57:01.544354537 +0000 UTC m=+2822.094720901" observedRunningTime="2025-10-02 11:57:02.076873415 +0000 UTC m=+2822.627239779" watchObservedRunningTime="2025-10-02 11:57:02.077781991 +0000 UTC m=+2822.628148345"
Oct 02 11:57:09 crc kubenswrapper[4929]: I1002 11:57:09.126294 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:57:09 crc kubenswrapper[4929]: I1002 11:57:09.126946 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:57:09 crc kubenswrapper[4929]: I1002 11:57:09.167581 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:57:10 crc kubenswrapper[4929]: I1002 11:57:10.180845 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:57:10 crc kubenswrapper[4929]: I1002 11:57:10.233039 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-674v6"]
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.153828 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-674v6" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="registry-server" containerID="cri-o://c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4" gracePeriod=2
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.569233 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.646999 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-utilities\") pod \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") "
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.647065 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-catalog-content\") pod \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") "
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.647127 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxcs6\" (UniqueName: \"kubernetes.io/projected/a15373dd-fe2f-4dde-ae06-f91be538ecb3-kube-api-access-gxcs6\") pod \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\" (UID: \"a15373dd-fe2f-4dde-ae06-f91be538ecb3\") "
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.648096 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-utilities" (OuterVolumeSpecName: "utilities") pod "a15373dd-fe2f-4dde-ae06-f91be538ecb3" (UID: "a15373dd-fe2f-4dde-ae06-f91be538ecb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.651794 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15373dd-fe2f-4dde-ae06-f91be538ecb3-kube-api-access-gxcs6" (OuterVolumeSpecName: "kube-api-access-gxcs6") pod "a15373dd-fe2f-4dde-ae06-f91be538ecb3" (UID: "a15373dd-fe2f-4dde-ae06-f91be538ecb3"). InnerVolumeSpecName "kube-api-access-gxcs6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.660300 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a15373dd-fe2f-4dde-ae06-f91be538ecb3" (UID: "a15373dd-fe2f-4dde-ae06-f91be538ecb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.749125 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxcs6\" (UniqueName: \"kubernetes.io/projected/a15373dd-fe2f-4dde-ae06-f91be538ecb3-kube-api-access-gxcs6\") on node \"crc\" DevicePath \"\""
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.749165 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:57:12 crc kubenswrapper[4929]: I1002 11:57:12.749175 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a15373dd-fe2f-4dde-ae06-f91be538ecb3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.151578 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x5pkf"]
Oct 02 11:57:13 crc kubenswrapper[4929]: E1002 11:57:13.152054 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="registry-server"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.152082 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="registry-server"
Oct 02 11:57:13 crc kubenswrapper[4929]: E1002 11:57:13.152154 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="extract-content"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.152165 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="extract-content"
Oct 02 11:57:13 crc kubenswrapper[4929]: E1002 11:57:13.152179 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="extract-utilities"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.152188 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="extract-utilities"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.152391 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerName="registry-server"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.154814 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5pkf"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.185815 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x5pkf"]
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.196631 4929 generic.go:334] "Generic (PLEG): container finished" podID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" containerID="c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4" exitCode=0
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.196921 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-674v6" event={"ID":"a15373dd-fe2f-4dde-ae06-f91be538ecb3","Type":"ContainerDied","Data":"c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4"}
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.197043 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-674v6" event={"ID":"a15373dd-fe2f-4dde-ae06-f91be538ecb3","Type":"ContainerDied","Data":"32b25baf39a37f24611e2aaa56257dbdc735d88468d4fec446c6a0147ece24c3"}
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.197151 4929 scope.go:117] "RemoveContainer" containerID="c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.197264 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-674v6"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.220841 4929 scope.go:117] "RemoveContainer" containerID="66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.233023 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-674v6"]
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.238337 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-674v6"]
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.257414 4929 scope.go:117] "RemoveContainer" containerID="62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.258832 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-utilities\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.259059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwpk9\" (UniqueName: \"kubernetes.io/projected/ea78936b-f93c-461c-81d3-0473197ac5f9-kube-api-access-hwpk9\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.259160 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-catalog-content\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf"
Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.277726 4929 scope.go:117] "RemoveContainer"
containerID="c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4" Oct 02 11:57:13 crc kubenswrapper[4929]: E1002 11:57:13.278316 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4\": container with ID starting with c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4 not found: ID does not exist" containerID="c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.278378 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4"} err="failed to get container status \"c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4\": rpc error: code = NotFound desc = could not find container \"c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4\": container with ID starting with c7cea1fa054dc5c56425714a4d5f30d88949d9d034ee59b85a09d45781ecdda4 not found: ID does not exist" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.278447 4929 scope.go:117] "RemoveContainer" containerID="66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db" Oct 02 11:57:13 crc kubenswrapper[4929]: E1002 11:57:13.278841 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db\": container with ID starting with 66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db not found: ID does not exist" containerID="66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.278869 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db"} err="failed to get container status \"66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db\": rpc error: code = NotFound desc = could not find container \"66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db\": container with ID starting with 66e3ee897059b2ff26c035106f23f0299e766ffef61273a79f2a4dda102416db not found: ID does not exist" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.278889 4929 scope.go:117] "RemoveContainer" containerID="62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c" Oct 02 11:57:13 crc kubenswrapper[4929]: E1002 11:57:13.279249 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c\": container with ID starting with 62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c not found: ID does not exist" containerID="62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.279268 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c"} err="failed to get container status \"62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c\": rpc error: code = NotFound desc = could not find container \"62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c\": container with ID starting with 
62dc2732a4e7c392f087bfe2fe815975afffa659e39c0cce707a0efb7da05b6c not found: ID does not exist" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.359868 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwpk9\" (UniqueName: \"kubernetes.io/projected/ea78936b-f93c-461c-81d3-0473197ac5f9-kube-api-access-hwpk9\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.359919 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-catalog-content\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.359946 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-utilities\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.360425 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-utilities\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.360999 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-catalog-content\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.390778 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwpk9\" (UniqueName: \"kubernetes.io/projected/ea78936b-f93c-461c-81d3-0473197ac5f9-kube-api-access-hwpk9\") pod \"redhat-operators-x5pkf\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.490555 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:13 crc kubenswrapper[4929]: I1002 11:57:13.943574 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x5pkf"] Oct 02 11:57:14 crc kubenswrapper[4929]: I1002 11:57:14.166219 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15373dd-fe2f-4dde-ae06-f91be538ecb3" path="/var/lib/kubelet/pods/a15373dd-fe2f-4dde-ae06-f91be538ecb3/volumes" Oct 02 11:57:14 crc kubenswrapper[4929]: I1002 11:57:14.205648 4929 generic.go:334] "Generic (PLEG): container finished" podID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerID="09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421" exitCode=0 Oct 02 11:57:14 crc kubenswrapper[4929]: I1002 11:57:14.205690 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5pkf" event={"ID":"ea78936b-f93c-461c-81d3-0473197ac5f9","Type":"ContainerDied","Data":"09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421"} Oct 02 11:57:14 crc kubenswrapper[4929]: I1002 11:57:14.205711 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5pkf" event={"ID":"ea78936b-f93c-461c-81d3-0473197ac5f9","Type":"ContainerStarted","Data":"eca2c9f3dea03c938534de5fdd8bb5d55737feeaa1ca410c5044ebcd3aa61ae0"} Oct 02 11:57:16 crc kubenswrapper[4929]: I1002 11:57:16.225638 4929 generic.go:334] "Generic (PLEG): container finished" podID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerID="b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee" exitCode=0 Oct 02 11:57:16 crc kubenswrapper[4929]: I1002 11:57:16.225735 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5pkf" event={"ID":"ea78936b-f93c-461c-81d3-0473197ac5f9","Type":"ContainerDied","Data":"b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee"} Oct 02 11:57:17 crc kubenswrapper[4929]: I1002 11:57:17.246100 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5pkf" event={"ID":"ea78936b-f93c-461c-81d3-0473197ac5f9","Type":"ContainerStarted","Data":"57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591"} Oct 02 11:57:17 crc kubenswrapper[4929]: I1002 11:57:17.273646 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x5pkf" podStartSLOduration=1.847378081 podStartE2EDuration="4.273626513s" podCreationTimestamp="2025-10-02 11:57:13 +0000 UTC" firstStartedPulling="2025-10-02 11:57:14.207161135 +0000 UTC m=+2834.757527499" lastFinishedPulling="2025-10-02 11:57:16.633409567 +0000 UTC m=+2837.183775931" observedRunningTime="2025-10-02 11:57:17.270583515 +0000 UTC m=+2837.820949899" watchObservedRunningTime="2025-10-02 11:57:17.273626513 +0000 UTC m=+2837.823992877" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.387770 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jxrp9"] Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.390218 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.406177 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxrp9"] Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.494338 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-catalog-content\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.494406 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkjpf\" (UniqueName: \"kubernetes.io/projected/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-kube-api-access-xkjpf\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.494428 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-utilities\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.596226 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-catalog-content\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.596290 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkjpf\" (UniqueName: \"kubernetes.io/projected/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-kube-api-access-xkjpf\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.596315 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-utilities\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.596756 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-utilities\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.596907 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-catalog-content\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.617704 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xkjpf\" (UniqueName: \"kubernetes.io/projected/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-kube-api-access-xkjpf\") pod \"community-operators-jxrp9\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:22 crc kubenswrapper[4929]: I1002 11:57:22.722263 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:23 crc kubenswrapper[4929]: I1002 11:57:23.248197 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxrp9"] Oct 02 11:57:23 crc kubenswrapper[4929]: I1002 11:57:23.305519 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrp9" event={"ID":"1bc910ce-e29a-440e-936c-fbba5ec3a4b0","Type":"ContainerStarted","Data":"a364cc5f12471a75229b14a3cd0d9d841b46bfe5f0dcad5e8bc4e89a463598dc"} Oct 02 11:57:23 crc kubenswrapper[4929]: I1002 11:57:23.491424 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:23 crc kubenswrapper[4929]: I1002 11:57:23.491494 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:23 crc kubenswrapper[4929]: I1002 11:57:23.532188 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:24 crc kubenswrapper[4929]: I1002 11:57:24.316310 4929 generic.go:334] "Generic (PLEG): container finished" podID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerID="5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5" exitCode=0 Oct 02 11:57:24 crc kubenswrapper[4929]: I1002 11:57:24.316396 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrp9" event={"ID":"1bc910ce-e29a-440e-936c-fbba5ec3a4b0","Type":"ContainerDied","Data":"5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5"} Oct 02 11:57:24 crc kubenswrapper[4929]: I1002 11:57:24.362531 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:25 crc kubenswrapper[4929]: I1002 11:57:25.330832 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrp9" event={"ID":"1bc910ce-e29a-440e-936c-fbba5ec3a4b0","Type":"ContainerStarted","Data":"d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1"} Oct 02 11:57:25 crc kubenswrapper[4929]: I1002 11:57:25.959625 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x5pkf"] Oct 02 11:57:26 crc kubenswrapper[4929]: I1002 11:57:26.341823 4929 generic.go:334] "Generic (PLEG): container finished" podID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerID="d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1" exitCode=0 Oct 02 11:57:26 crc kubenswrapper[4929]: I1002 11:57:26.341879 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrp9" event={"ID":"1bc910ce-e29a-440e-936c-fbba5ec3a4b0","Type":"ContainerDied","Data":"d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1"} Oct 02 11:57:26 crc kubenswrapper[4929]: I1002 11:57:26.343112 4929 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-x5pkf" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="registry-server" containerID="cri-o://57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591" gracePeriod=2 Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.224122 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.266169 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-catalog-content\") pod \"ea78936b-f93c-461c-81d3-0473197ac5f9\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.266213 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-utilities\") pod \"ea78936b-f93c-461c-81d3-0473197ac5f9\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.266244 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwpk9\" (UniqueName: \"kubernetes.io/projected/ea78936b-f93c-461c-81d3-0473197ac5f9-kube-api-access-hwpk9\") pod \"ea78936b-f93c-461c-81d3-0473197ac5f9\" (UID: \"ea78936b-f93c-461c-81d3-0473197ac5f9\") " Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.267143 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-utilities" (OuterVolumeSpecName: "utilities") pod "ea78936b-f93c-461c-81d3-0473197ac5f9" (UID: "ea78936b-f93c-461c-81d3-0473197ac5f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.271805 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea78936b-f93c-461c-81d3-0473197ac5f9-kube-api-access-hwpk9" (OuterVolumeSpecName: "kube-api-access-hwpk9") pod "ea78936b-f93c-461c-81d3-0473197ac5f9" (UID: "ea78936b-f93c-461c-81d3-0473197ac5f9"). InnerVolumeSpecName "kube-api-access-hwpk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.352285 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrp9" event={"ID":"1bc910ce-e29a-440e-936c-fbba5ec3a4b0","Type":"ContainerStarted","Data":"2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825"} Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.356709 4929 generic.go:334] "Generic (PLEG): container finished" podID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerID="57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591" exitCode=0 Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.356755 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5pkf" event={"ID":"ea78936b-f93c-461c-81d3-0473197ac5f9","Type":"ContainerDied","Data":"57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591"} Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.356783 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5pkf" event={"ID":"ea78936b-f93c-461c-81d3-0473197ac5f9","Type":"ContainerDied","Data":"eca2c9f3dea03c938534de5fdd8bb5d55737feeaa1ca410c5044ebcd3aa61ae0"} Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.356802 4929 scope.go:117] "RemoveContainer" containerID="57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.356932 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5pkf" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.365034 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea78936b-f93c-461c-81d3-0473197ac5f9" (UID: "ea78936b-f93c-461c-81d3-0473197ac5f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.371659 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.371703 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea78936b-f93c-461c-81d3-0473197ac5f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.371716 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwpk9\" (UniqueName: \"kubernetes.io/projected/ea78936b-f93c-461c-81d3-0473197ac5f9-kube-api-access-hwpk9\") on node \"crc\" DevicePath \"\"" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.381307 4929 scope.go:117] "RemoveContainer" containerID="b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.381705 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jxrp9" podStartSLOduration=2.900416967 podStartE2EDuration="5.381694686s" podCreationTimestamp="2025-10-02 11:57:22 +0000 UTC" firstStartedPulling="2025-10-02 11:57:24.318365309 +0000 UTC m=+2844.868731673" lastFinishedPulling="2025-10-02 11:57:26.799643028 +0000 UTC m=+2847.350009392" observedRunningTime="2025-10-02 11:57:27.378617318 +0000 UTC m=+2847.928983692" watchObservedRunningTime="2025-10-02 11:57:27.381694686 +0000 UTC m=+2847.932061050" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.403582 4929 scope.go:117] "RemoveContainer" containerID="09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.420095 4929 scope.go:117] "RemoveContainer" containerID="57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591" Oct 02 11:57:27 crc kubenswrapper[4929]: E1002 11:57:27.420561 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591\": container with ID starting with 57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591 not found: ID does not exist" containerID="57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.420608 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591"} err="failed to get container status \"57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591\": rpc error: code = NotFound desc = could not find container \"57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591\": container with ID starting with 57b2eb7a966eca4266d194e911b7da1e6caa465f77c79b11dfd6de88da7e7591 not found: ID does not exist" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.420635 4929 scope.go:117] "RemoveContainer" containerID="b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee" Oct 02 11:57:27 crc kubenswrapper[4929]: E1002 11:57:27.421090 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee\": container with ID 
starting with b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee not found: ID does not exist" containerID="b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.421206 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee"} err="failed to get container status \"b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee\": rpc error: code = NotFound desc = could not find container \"b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee\": container with ID starting with b54b5613af89e02042ceec0967d38c52fe8ca50791b3fc6a4234a4a28023d8ee not found: ID does not exist" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.421346 4929 scope.go:117] "RemoveContainer" containerID="09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421" Oct 02 11:57:27 crc kubenswrapper[4929]: E1002 11:57:27.422182 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421\": container with ID starting with 09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421 not found: ID does not exist" containerID="09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.422220 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421"} err="failed to get container status \"09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421\": rpc error: code = NotFound desc = could not find container \"09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421\": container with ID starting with 09db119ec473142047e1baf63c47f3b8604affb0b84d925b3e03a593fbf11421 not found: ID does not exist" Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.691391 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x5pkf"] Oct 02 11:57:27 crc kubenswrapper[4929]: I1002 11:57:27.696694 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x5pkf"] Oct 02 11:57:28 crc kubenswrapper[4929]: I1002 11:57:28.164453 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" path="/var/lib/kubelet/pods/ea78936b-f93c-461c-81d3-0473197ac5f9/volumes" Oct 02 11:57:32 crc kubenswrapper[4929]: I1002 11:57:32.722672 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:32 crc kubenswrapper[4929]: I1002 11:57:32.723492 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:32 crc kubenswrapper[4929]: I1002 11:57:32.799873 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:33 crc kubenswrapper[4929]: I1002 11:57:33.452672 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:33 crc kubenswrapper[4929]: I1002 11:57:33.958538 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jxrp9"] Oct 02 
11:57:35 crc kubenswrapper[4929]: I1002 11:57:35.414722 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jxrp9" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="registry-server" containerID="cri-o://2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825" gracePeriod=2 Oct 02 11:57:35 crc kubenswrapper[4929]: I1002 11:57:35.846058 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.033671 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-utilities\") pod \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.034570 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-utilities" (OuterVolumeSpecName: "utilities") pod "1bc910ce-e29a-440e-936c-fbba5ec3a4b0" (UID: "1bc910ce-e29a-440e-936c-fbba5ec3a4b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.034981 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-catalog-content\") pod \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.035058 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkjpf\" (UniqueName: \"kubernetes.io/projected/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-kube-api-access-xkjpf\") pod \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\" (UID: \"1bc910ce-e29a-440e-936c-fbba5ec3a4b0\") " Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.035435 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.044637 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-kube-api-access-xkjpf" (OuterVolumeSpecName: "kube-api-access-xkjpf") pod "1bc910ce-e29a-440e-936c-fbba5ec3a4b0" (UID: "1bc910ce-e29a-440e-936c-fbba5ec3a4b0"). InnerVolumeSpecName "kube-api-access-xkjpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.104316 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1bc910ce-e29a-440e-936c-fbba5ec3a4b0" (UID: "1bc910ce-e29a-440e-936c-fbba5ec3a4b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.136869 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.136905 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkjpf\" (UniqueName: \"kubernetes.io/projected/1bc910ce-e29a-440e-936c-fbba5ec3a4b0-kube-api-access-xkjpf\") on node \"crc\" DevicePath \"\"" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.427526 4929 generic.go:334] "Generic (PLEG): container finished" podID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerID="2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825" exitCode=0 Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.427614 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrp9" event={"ID":"1bc910ce-e29a-440e-936c-fbba5ec3a4b0","Type":"ContainerDied","Data":"2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825"} Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.427674 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxrp9" event={"ID":"1bc910ce-e29a-440e-936c-fbba5ec3a4b0","Type":"ContainerDied","Data":"a364cc5f12471a75229b14a3cd0d9d841b46bfe5f0dcad5e8bc4e89a463598dc"} Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.427702 4929 scope.go:117] "RemoveContainer" containerID="2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.427621 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxrp9" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.453365 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jxrp9"] Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.465581 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jxrp9"] Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.465773 4929 scope.go:117] "RemoveContainer" containerID="d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.483007 4929 scope.go:117] "RemoveContainer" containerID="5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.513546 4929 scope.go:117] "RemoveContainer" containerID="2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825" Oct 02 11:57:36 crc kubenswrapper[4929]: E1002 11:57:36.514216 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825\": container with ID starting with 2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825 not found: ID does not exist" containerID="2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.514292 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825"} err="failed to get container status \"2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825\": rpc error: code = NotFound desc = could not find container \"2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825\": container with ID starting with 2ab7012caa7f866f5f0a6b7836a2a2898766df610597491e7393a40d637f5825 not found: ID does not exist" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.514357 4929 scope.go:117] "RemoveContainer" containerID="d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1" Oct 02 11:57:36 crc kubenswrapper[4929]: E1002 11:57:36.514829 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1\": container with ID starting with d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1 not found: ID does not exist" containerID="d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.514900 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1"} err="failed to get container status \"d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1\": rpc error: code = NotFound desc = could not find container \"d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1\": container with ID starting with d984e9bbc31ab8846e0aee9e9da51a66b28a729f42ebfc9365016c3fb7061de1 not found: ID does not exist" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.515012 4929 scope.go:117] "RemoveContainer" containerID="5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5" Oct 02 11:57:36 crc kubenswrapper[4929]: E1002 11:57:36.515343 4929 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5\": container with ID starting with 5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5 not found: ID does not exist" containerID="5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5" Oct 02 11:57:36 crc kubenswrapper[4929]: I1002 11:57:36.515387 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5"} err="failed to get container status \"5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5\": rpc error: code = NotFound desc = could not find container \"5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5\": container with ID starting with 5a251692eb89ab473f8a882ad1df2e9ed500c4ec06d7c8cac361af86ac73bef5 not found: ID does not exist" Oct 02 11:57:38 crc kubenswrapper[4929]: I1002 11:57:38.164996 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" path="/var/lib/kubelet/pods/1bc910ce-e29a-440e-936c-fbba5ec3a4b0/volumes" Oct 02 11:57:44 crc kubenswrapper[4929]: I1002 11:57:44.737171 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:57:44 crc kubenswrapper[4929]: I1002 11:57:44.737708 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:58:14 crc kubenswrapper[4929]: I1002 11:58:14.736911 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:58:14 crc kubenswrapper[4929]: I1002 11:58:14.737386 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.736819 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.737356 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.737410 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.738063 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d57889afc11f842043961ae2a5f223d1d9fb98239a13c2cb57d89197046bf42"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.738115 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://4d57889afc11f842043961ae2a5f223d1d9fb98239a13c2cb57d89197046bf42" gracePeriod=600 Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.937429 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="4d57889afc11f842043961ae2a5f223d1d9fb98239a13c2cb57d89197046bf42" exitCode=0 Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.937473 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"4d57889afc11f842043961ae2a5f223d1d9fb98239a13c2cb57d89197046bf42"} Oct 02 11:58:44 crc kubenswrapper[4929]: I1002 11:58:44.937503 4929 scope.go:117] "RemoveContainer" containerID="09e0c006ec3c8af68c81461a95a4cd813a50218400a783c181ae8a3a1ce4d199" Oct 02 11:58:45 crc kubenswrapper[4929]: I1002 11:58:45.948605 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd"} Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.177176 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc"] Oct 02 12:00:00 crc kubenswrapper[4929]: E1002 12:00:00.178143 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178162 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4929]: E1002 12:00:00.178211 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="extract-content" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178220 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="extract-content" Oct 02 12:00:00 crc kubenswrapper[4929]: E1002 12:00:00.178236 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="extract-content" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178243 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="extract-content" Oct 02 12:00:00 crc kubenswrapper[4929]: E1002 12:00:00.178253 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178262 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4929]: E1002 12:00:00.178278 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="extract-utilities" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178285 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="extract-utilities" Oct 02 12:00:00 crc kubenswrapper[4929]: E1002 12:00:00.178299 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="extract-utilities" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178306 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="extract-utilities" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178472 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc910ce-e29a-440e-936c-fbba5ec3a4b0" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.178490 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea78936b-f93c-461c-81d3-0473197ac5f9" containerName="registry-server" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.179086 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.184708 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.184721 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.188765 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc"] Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.294461 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-config-volume\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.294539 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hww5f\" (UniqueName: \"kubernetes.io/projected/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-kube-api-access-hww5f\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.294584 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-secret-volume\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.395854 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hww5f\" (UniqueName: \"kubernetes.io/projected/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-kube-api-access-hww5f\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.395924 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-secret-volume\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.396029 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-config-volume\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.397373 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-config-volume\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.401601 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-secret-volume\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.411339 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hww5f\" (UniqueName: \"kubernetes.io/projected/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-kube-api-access-hww5f\") pod \"collect-profiles-29323440-4grmc\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.500389 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:00 crc kubenswrapper[4929]: I1002 12:00:00.891143 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc"] Oct 02 12:00:01 crc kubenswrapper[4929]: I1002 12:00:01.463027 4929 generic.go:334] "Generic (PLEG): container finished" podID="8f69a79f-60ac-4096-aeb0-0f1edb98dd00" containerID="d6b9730bc85ccd2ae176f6b970ebd605b22580c4201b12f95d560e756e37fc94" exitCode=0 Oct 02 12:00:01 crc kubenswrapper[4929]: I1002 12:00:01.463113 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" event={"ID":"8f69a79f-60ac-4096-aeb0-0f1edb98dd00","Type":"ContainerDied","Data":"d6b9730bc85ccd2ae176f6b970ebd605b22580c4201b12f95d560e756e37fc94"} Oct 02 12:00:01 crc kubenswrapper[4929]: I1002 12:00:01.463355 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" event={"ID":"8f69a79f-60ac-4096-aeb0-0f1edb98dd00","Type":"ContainerStarted","Data":"4938381ada820193b5208aa32a1b5eea06f97f1e9b300e5af8c67f9ca51576dc"} Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.781225 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.825836 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hww5f\" (UniqueName: \"kubernetes.io/projected/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-kube-api-access-hww5f\") pod \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.825951 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-config-volume\") pod \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.826805 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f69a79f-60ac-4096-aeb0-0f1edb98dd00" (UID: "8f69a79f-60ac-4096-aeb0-0f1edb98dd00"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.833906 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-kube-api-access-hww5f" (OuterVolumeSpecName: "kube-api-access-hww5f") pod "8f69a79f-60ac-4096-aeb0-0f1edb98dd00" (UID: "8f69a79f-60ac-4096-aeb0-0f1edb98dd00"). InnerVolumeSpecName "kube-api-access-hww5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.926707 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-secret-volume\") pod \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\" (UID: \"8f69a79f-60ac-4096-aeb0-0f1edb98dd00\") " Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.927129 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hww5f\" (UniqueName: \"kubernetes.io/projected/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-kube-api-access-hww5f\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.927141 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:02 crc kubenswrapper[4929]: I1002 12:00:02.935695 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f69a79f-60ac-4096-aeb0-0f1edb98dd00" (UID: "8f69a79f-60ac-4096-aeb0-0f1edb98dd00"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:00:03 crc kubenswrapper[4929]: I1002 12:00:03.028688 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f69a79f-60ac-4096-aeb0-0f1edb98dd00-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:00:03 crc kubenswrapper[4929]: I1002 12:00:03.484404 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" event={"ID":"8f69a79f-60ac-4096-aeb0-0f1edb98dd00","Type":"ContainerDied","Data":"4938381ada820193b5208aa32a1b5eea06f97f1e9b300e5af8c67f9ca51576dc"} Oct 02 12:00:03 crc kubenswrapper[4929]: I1002 12:00:03.484762 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4938381ada820193b5208aa32a1b5eea06f97f1e9b300e5af8c67f9ca51576dc" Oct 02 12:00:03 crc kubenswrapper[4929]: I1002 12:00:03.484489 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc" Oct 02 12:00:03 crc kubenswrapper[4929]: I1002 12:00:03.852190 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh"] Oct 02 12:00:03 crc kubenswrapper[4929]: I1002 12:00:03.857041 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-8v8gh"] Oct 02 12:00:04 crc kubenswrapper[4929]: I1002 12:00:04.175587 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a0fd470-ccd3-4178-8aec-422779131298" path="/var/lib/kubelet/pods/5a0fd470-ccd3-4178-8aec-422779131298/volumes" Oct 02 12:00:14 crc kubenswrapper[4929]: I1002 12:00:14.685286 4929 scope.go:117] "RemoveContainer" containerID="7e45bc0dedef809659788fe293fc565bb44e69320e939136a117b9f84cfa7731" Oct 02 12:00:44 crc kubenswrapper[4929]: I1002 12:00:44.737160 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:00:44 crc kubenswrapper[4929]: I1002 12:00:44.737568 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:01:14 crc kubenswrapper[4929]: I1002 12:01:14.736931 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:01:14 crc kubenswrapper[4929]: I1002 12:01:14.737639 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:01:44 crc kubenswrapper[4929]: I1002 12:01:44.736419 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:01:44 crc kubenswrapper[4929]: I1002 12:01:44.736920 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:01:44 crc kubenswrapper[4929]: I1002 12:01:44.737035 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:01:44 crc kubenswrapper[4929]: I1002 12:01:44.737650 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:01:44 crc kubenswrapper[4929]: I1002 12:01:44.737707 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" gracePeriod=600 Oct 02 12:01:44 crc kubenswrapper[4929]: E1002 12:01:44.869190 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:01:45 crc kubenswrapper[4929]: I1002 12:01:45.192327 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" exitCode=0 Oct 02 12:01:45 crc kubenswrapper[4929]: I1002 12:01:45.192368 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd"} Oct 02 12:01:45 crc kubenswrapper[4929]: I1002 12:01:45.192404 4929 scope.go:117] "RemoveContainer" containerID="4d57889afc11f842043961ae2a5f223d1d9fb98239a13c2cb57d89197046bf42" Oct 02 12:01:45 crc kubenswrapper[4929]: I1002 12:01:45.192913 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:01:45 crc kubenswrapper[4929]: E1002 12:01:45.193217 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:02:00 crc kubenswrapper[4929]: I1002 12:02:00.160371 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:02:00 crc kubenswrapper[4929]: E1002 12:02:00.162540 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:02:15 crc kubenswrapper[4929]: I1002 12:02:15.156062 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:02:15 crc kubenswrapper[4929]: E1002 12:02:15.156760 4929 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:02:26 crc kubenswrapper[4929]: I1002 12:02:26.157000 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:02:26 crc kubenswrapper[4929]: E1002 12:02:26.159592 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:02:40 crc kubenswrapper[4929]: I1002 12:02:40.159638 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:02:40 crc kubenswrapper[4929]: E1002 12:02:40.160352 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:02:52 crc kubenswrapper[4929]: I1002 12:02:52.157046 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:02:52 crc kubenswrapper[4929]: E1002 12:02:52.157722 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:03:06 crc kubenswrapper[4929]: I1002 12:03:06.157492 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:03:06 crc kubenswrapper[4929]: E1002 12:03:06.158660 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:03:19 crc kubenswrapper[4929]: I1002 12:03:19.157064 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:03:19 crc kubenswrapper[4929]: E1002 12:03:19.157685 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:03:31 crc kubenswrapper[4929]: I1002 12:03:31.156451 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:03:31 crc kubenswrapper[4929]: E1002 12:03:31.158828 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:03:43 crc kubenswrapper[4929]: I1002 12:03:43.156685 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:03:43 crc kubenswrapper[4929]: E1002 12:03:43.157387 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:03:57 crc kubenswrapper[4929]: I1002 12:03:57.156433 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:03:57 crc kubenswrapper[4929]: E1002 12:03:57.157090 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:04:11 crc kubenswrapper[4929]: I1002 12:04:11.156395 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:04:11 crc kubenswrapper[4929]: E1002 12:04:11.157173 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:04:25 crc kubenswrapper[4929]: I1002 12:04:25.156243 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:04:25 crc kubenswrapper[4929]: E1002 12:04:25.156898 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:04:37 crc kubenswrapper[4929]: I1002 12:04:37.157095 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:04:37 crc kubenswrapper[4929]: E1002 12:04:37.157745 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:04:51 crc kubenswrapper[4929]: I1002 12:04:51.156417 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:04:51 crc kubenswrapper[4929]: E1002 12:04:51.157343 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:05:06 crc kubenswrapper[4929]: I1002 12:05:06.157602 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:05:06 crc kubenswrapper[4929]: E1002 12:05:06.158220 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:05:18 crc kubenswrapper[4929]: I1002 12:05:18.156899 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:05:18 crc kubenswrapper[4929]: E1002 12:05:18.157937 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.588347 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8sc89"] Oct 02 12:05:30 crc kubenswrapper[4929]: E1002 12:05:30.590102 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f69a79f-60ac-4096-aeb0-0f1edb98dd00" containerName="collect-profiles" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.590151 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f69a79f-60ac-4096-aeb0-0f1edb98dd00" containerName="collect-profiles" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.590385 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f69a79f-60ac-4096-aeb0-0f1edb98dd00" containerName="collect-profiles" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 
12:05:30.591763 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.611838 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8sc89"] Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.710012 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr2wn\" (UniqueName: \"kubernetes.io/projected/9e3276c4-0ece-43c3-ac6b-060c7fe23164-kube-api-access-hr2wn\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.710074 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-utilities\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.710101 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-catalog-content\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.811571 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-catalog-content\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.811700 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr2wn\" (UniqueName: \"kubernetes.io/projected/9e3276c4-0ece-43c3-ac6b-060c7fe23164-kube-api-access-hr2wn\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.811741 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-utilities\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.812143 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-catalog-content\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.812169 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-utilities\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc 
kubenswrapper[4929]: I1002 12:05:30.832289 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr2wn\" (UniqueName: \"kubernetes.io/projected/9e3276c4-0ece-43c3-ac6b-060c7fe23164-kube-api-access-hr2wn\") pod \"certified-operators-8sc89\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:30 crc kubenswrapper[4929]: I1002 12:05:30.911784 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:31 crc kubenswrapper[4929]: I1002 12:05:31.157111 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:05:31 crc kubenswrapper[4929]: E1002 12:05:31.157681 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:05:31 crc kubenswrapper[4929]: I1002 12:05:31.399199 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8sc89"] Oct 02 12:05:31 crc kubenswrapper[4929]: I1002 12:05:31.841210 4929 generic.go:334] "Generic (PLEG): container finished" podID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerID="a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589" exitCode=0 Oct 02 12:05:31 crc kubenswrapper[4929]: I1002 12:05:31.841273 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sc89" event={"ID":"9e3276c4-0ece-43c3-ac6b-060c7fe23164","Type":"ContainerDied","Data":"a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589"} Oct 02 12:05:31 crc kubenswrapper[4929]: I1002 12:05:31.841468 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sc89" event={"ID":"9e3276c4-0ece-43c3-ac6b-060c7fe23164","Type":"ContainerStarted","Data":"8cfc0c0be5934b8b70502c45651117120c50fa47c69161c24b5e6e23eebc53c6"} Oct 02 12:05:31 crc kubenswrapper[4929]: I1002 12:05:31.843221 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:05:32 crc kubenswrapper[4929]: I1002 12:05:32.849801 4929 generic.go:334] "Generic (PLEG): container finished" podID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerID="aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c" exitCode=0 Oct 02 12:05:32 crc kubenswrapper[4929]: I1002 12:05:32.849842 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sc89" event={"ID":"9e3276c4-0ece-43c3-ac6b-060c7fe23164","Type":"ContainerDied","Data":"aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c"} Oct 02 12:05:33 crc kubenswrapper[4929]: I1002 12:05:33.858784 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sc89" event={"ID":"9e3276c4-0ece-43c3-ac6b-060c7fe23164","Type":"ContainerStarted","Data":"4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b"} Oct 02 12:05:33 crc kubenswrapper[4929]: I1002 12:05:33.875707 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-8sc89" podStartSLOduration=2.033160245 podStartE2EDuration="3.87569059s" podCreationTimestamp="2025-10-02 12:05:30 +0000 UTC" firstStartedPulling="2025-10-02 12:05:31.842902095 +0000 UTC m=+3332.393268469" lastFinishedPulling="2025-10-02 12:05:33.68543245 +0000 UTC m=+3334.235798814" observedRunningTime="2025-10-02 12:05:33.875434023 +0000 UTC m=+3334.425800397" watchObservedRunningTime="2025-10-02 12:05:33.87569059 +0000 UTC m=+3334.426056954" Oct 02 12:05:40 crc kubenswrapper[4929]: I1002 12:05:40.911974 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:40 crc kubenswrapper[4929]: I1002 12:05:40.912493 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:40 crc kubenswrapper[4929]: I1002 12:05:40.951951 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:41 crc kubenswrapper[4929]: I1002 12:05:41.954367 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:41 crc kubenswrapper[4929]: I1002 12:05:41.994666 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8sc89"] Oct 02 12:05:43 crc kubenswrapper[4929]: I1002 12:05:43.929004 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8sc89" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="registry-server" containerID="cri-o://4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b" gracePeriod=2 Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.303938 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.408440 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-utilities\") pod \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.409637 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-utilities" (OuterVolumeSpecName: "utilities") pod "9e3276c4-0ece-43c3-ac6b-060c7fe23164" (UID: "9e3276c4-0ece-43c3-ac6b-060c7fe23164"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.409889 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-catalog-content\") pod \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.409944 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr2wn\" (UniqueName: \"kubernetes.io/projected/9e3276c4-0ece-43c3-ac6b-060c7fe23164-kube-api-access-hr2wn\") pod \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\" (UID: \"9e3276c4-0ece-43c3-ac6b-060c7fe23164\") " Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.410320 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.416739 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3276c4-0ece-43c3-ac6b-060c7fe23164-kube-api-access-hr2wn" (OuterVolumeSpecName: "kube-api-access-hr2wn") pod "9e3276c4-0ece-43c3-ac6b-060c7fe23164" (UID: "9e3276c4-0ece-43c3-ac6b-060c7fe23164"). InnerVolumeSpecName "kube-api-access-hr2wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.457344 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e3276c4-0ece-43c3-ac6b-060c7fe23164" (UID: "9e3276c4-0ece-43c3-ac6b-060c7fe23164"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.511178 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e3276c4-0ece-43c3-ac6b-060c7fe23164-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.511230 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr2wn\" (UniqueName: \"kubernetes.io/projected/9e3276c4-0ece-43c3-ac6b-060c7fe23164-kube-api-access-hr2wn\") on node \"crc\" DevicePath \"\"" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.936808 4929 generic.go:334] "Generic (PLEG): container finished" podID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerID="4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b" exitCode=0 Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.936858 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sc89" event={"ID":"9e3276c4-0ece-43c3-ac6b-060c7fe23164","Type":"ContainerDied","Data":"4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b"} Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.936886 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sc89" event={"ID":"9e3276c4-0ece-43c3-ac6b-060c7fe23164","Type":"ContainerDied","Data":"8cfc0c0be5934b8b70502c45651117120c50fa47c69161c24b5e6e23eebc53c6"} Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.936889 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8sc89" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.936902 4929 scope.go:117] "RemoveContainer" containerID="4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.972243 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8sc89"] Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.976150 4929 scope.go:117] "RemoveContainer" containerID="aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c" Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.978598 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8sc89"] Oct 02 12:05:44 crc kubenswrapper[4929]: I1002 12:05:44.994664 4929 scope.go:117] "RemoveContainer" containerID="a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589" Oct 02 12:05:45 crc kubenswrapper[4929]: I1002 12:05:45.014172 4929 scope.go:117] "RemoveContainer" containerID="4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b" Oct 02 12:05:45 crc kubenswrapper[4929]: E1002 12:05:45.014605 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b\": container with ID starting with 4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b not found: ID does not exist" containerID="4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b" Oct 02 12:05:45 crc kubenswrapper[4929]: I1002 12:05:45.014650 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b"} err="failed to get container status \"4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b\": rpc error: code = NotFound desc = could not find container \"4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b\": container with ID starting with 4cb59938d9d9e59307b80e763c2653d66c6fac610a634ea2f52c5022bf8da50b not found: ID does not exist" Oct 02 12:05:45 crc kubenswrapper[4929]: I1002 12:05:45.014677 4929 scope.go:117] "RemoveContainer" containerID="aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c" Oct 02 12:05:45 crc kubenswrapper[4929]: E1002 12:05:45.015107 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c\": container with ID starting with aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c not found: ID does not exist" containerID="aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c" Oct 02 12:05:45 crc kubenswrapper[4929]: I1002 12:05:45.015140 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c"} err="failed to get container status \"aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c\": rpc error: code = NotFound desc = could not find container \"aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c\": container with ID starting with aa2837d2dc274bbc7a86b8ef93b350b08c35515843e1421d9f21f0d23155992c not found: ID does not exist" Oct 02 12:05:45 crc kubenswrapper[4929]: I1002 12:05:45.015184 4929 scope.go:117] "RemoveContainer" 
containerID="a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589" Oct 02 12:05:45 crc kubenswrapper[4929]: E1002 12:05:45.015535 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589\": container with ID starting with a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589 not found: ID does not exist" containerID="a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589" Oct 02 12:05:45 crc kubenswrapper[4929]: I1002 12:05:45.015572 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589"} err="failed to get container status \"a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589\": rpc error: code = NotFound desc = could not find container \"a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589\": container with ID starting with a681085b494c1dc148b8d97b4d6fd9bbccee6bb8b0ecfaff769c27fb84260589 not found: ID does not exist" Oct 02 12:05:46 crc kubenswrapper[4929]: I1002 12:05:46.156511 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:05:46 crc kubenswrapper[4929]: E1002 12:05:46.156713 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:05:46 crc kubenswrapper[4929]: I1002 12:05:46.165140 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" path="/var/lib/kubelet/pods/9e3276c4-0ece-43c3-ac6b-060c7fe23164/volumes" Oct 02 12:06:00 crc kubenswrapper[4929]: I1002 12:06:00.168888 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:06:00 crc kubenswrapper[4929]: E1002 12:06:00.170059 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:06:13 crc kubenswrapper[4929]: I1002 12:06:13.156046 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:06:13 crc kubenswrapper[4929]: E1002 12:06:13.156750 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:06:26 crc kubenswrapper[4929]: I1002 12:06:26.156695 4929 scope.go:117] "RemoveContainer" 
containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:06:26 crc kubenswrapper[4929]: E1002 12:06:26.158388 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:06:40 crc kubenswrapper[4929]: I1002 12:06:40.159426 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:06:40 crc kubenswrapper[4929]: E1002 12:06:40.160098 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:06:54 crc kubenswrapper[4929]: I1002 12:06:54.156510 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:06:54 crc kubenswrapper[4929]: I1002 12:06:54.398499 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"912ccf0a100d0c6a17cdbd41ee6d5704bdccc909b7753b8466defd0f2fb7a0a7"} Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.744594 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lpjmh"] Oct 02 12:07:32 crc kubenswrapper[4929]: E1002 12:07:32.745516 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="extract-utilities" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.745532 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="extract-utilities" Oct 02 12:07:32 crc kubenswrapper[4929]: E1002 12:07:32.745563 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="extract-content" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.745571 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="extract-content" Oct 02 12:07:32 crc kubenswrapper[4929]: E1002 12:07:32.745586 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="registry-server" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.745594 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="registry-server" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.745827 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3276c4-0ece-43c3-ac6b-060c7fe23164" containerName="registry-server" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.748115 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.753468 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpjmh"] Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.844645 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-catalog-content\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.844781 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-utilities\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.844831 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n974v\" (UniqueName: \"kubernetes.io/projected/f2874ec5-dfa5-46fe-983c-8206b477f1ea-kube-api-access-n974v\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.945761 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-catalog-content\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.946215 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-utilities\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.946294 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n974v\" (UniqueName: \"kubernetes.io/projected/f2874ec5-dfa5-46fe-983c-8206b477f1ea-kube-api-access-n974v\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.946376 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-catalog-content\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.946636 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-utilities\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:32 crc kubenswrapper[4929]: I1002 12:07:32.966547 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n974v\" (UniqueName: \"kubernetes.io/projected/f2874ec5-dfa5-46fe-983c-8206b477f1ea-kube-api-access-n974v\") pod \"redhat-operators-lpjmh\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:33 crc kubenswrapper[4929]: I1002 12:07:33.069764 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:33 crc kubenswrapper[4929]: I1002 12:07:33.487555 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpjmh"] Oct 02 12:07:33 crc kubenswrapper[4929]: I1002 12:07:33.684796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpjmh" event={"ID":"f2874ec5-dfa5-46fe-983c-8206b477f1ea","Type":"ContainerStarted","Data":"97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c"} Oct 02 12:07:33 crc kubenswrapper[4929]: I1002 12:07:33.684839 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpjmh" event={"ID":"f2874ec5-dfa5-46fe-983c-8206b477f1ea","Type":"ContainerStarted","Data":"7abfece731b433f4c468e0ed40f12e75e5feab1ae55c5ee13e3fe8d1d3d43a4c"} Oct 02 12:07:34 crc kubenswrapper[4929]: I1002 12:07:34.704169 4929 generic.go:334] "Generic (PLEG): container finished" podID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerID="97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c" exitCode=0 Oct 02 12:07:34 crc kubenswrapper[4929]: I1002 12:07:34.704288 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpjmh" event={"ID":"f2874ec5-dfa5-46fe-983c-8206b477f1ea","Type":"ContainerDied","Data":"97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c"} Oct 02 12:07:36 crc kubenswrapper[4929]: I1002 12:07:36.727369 4929 generic.go:334] "Generic (PLEG): container finished" podID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerID="a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e" exitCode=0 Oct 02 12:07:36 crc kubenswrapper[4929]: I1002 12:07:36.727436 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpjmh" event={"ID":"f2874ec5-dfa5-46fe-983c-8206b477f1ea","Type":"ContainerDied","Data":"a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e"} Oct 02 12:07:37 crc kubenswrapper[4929]: I1002 12:07:37.737350 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpjmh" event={"ID":"f2874ec5-dfa5-46fe-983c-8206b477f1ea","Type":"ContainerStarted","Data":"36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b"} Oct 02 12:07:37 crc kubenswrapper[4929]: I1002 12:07:37.759138 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lpjmh" podStartSLOduration=3.2591220229999998 podStartE2EDuration="5.759122776s" podCreationTimestamp="2025-10-02 12:07:32 +0000 UTC" firstStartedPulling="2025-10-02 12:07:34.705946686 +0000 UTC m=+3455.256313050" lastFinishedPulling="2025-10-02 12:07:37.205947419 +0000 UTC m=+3457.756313803" observedRunningTime="2025-10-02 12:07:37.75858442 +0000 UTC m=+3458.308950804" watchObservedRunningTime="2025-10-02 12:07:37.759122776 +0000 UTC m=+3458.309489140" Oct 02 12:07:43 crc kubenswrapper[4929]: I1002 12:07:43.070282 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 
02 12:07:43 crc kubenswrapper[4929]: I1002 12:07:43.070845 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:43 crc kubenswrapper[4929]: I1002 12:07:43.115601 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:43 crc kubenswrapper[4929]: I1002 12:07:43.817101 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:43 crc kubenswrapper[4929]: I1002 12:07:43.857440 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpjmh"] Oct 02 12:07:45 crc kubenswrapper[4929]: I1002 12:07:45.790706 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lpjmh" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="registry-server" containerID="cri-o://36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b" gracePeriod=2 Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.160427 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.338829 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n974v\" (UniqueName: \"kubernetes.io/projected/f2874ec5-dfa5-46fe-983c-8206b477f1ea-kube-api-access-n974v\") pod \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.339373 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-utilities\") pod \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.339521 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-catalog-content\") pod \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\" (UID: \"f2874ec5-dfa5-46fe-983c-8206b477f1ea\") " Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.340330 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-utilities" (OuterVolumeSpecName: "utilities") pod "f2874ec5-dfa5-46fe-983c-8206b477f1ea" (UID: "f2874ec5-dfa5-46fe-983c-8206b477f1ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.346245 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2874ec5-dfa5-46fe-983c-8206b477f1ea-kube-api-access-n974v" (OuterVolumeSpecName: "kube-api-access-n974v") pod "f2874ec5-dfa5-46fe-983c-8206b477f1ea" (UID: "f2874ec5-dfa5-46fe-983c-8206b477f1ea"). InnerVolumeSpecName "kube-api-access-n974v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.435126 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2874ec5-dfa5-46fe-983c-8206b477f1ea" (UID: "f2874ec5-dfa5-46fe-983c-8206b477f1ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.441363 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.441402 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n974v\" (UniqueName: \"kubernetes.io/projected/f2874ec5-dfa5-46fe-983c-8206b477f1ea-kube-api-access-n974v\") on node \"crc\" DevicePath \"\"" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.441419 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2874ec5-dfa5-46fe-983c-8206b477f1ea-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.801588 4929 generic.go:334] "Generic (PLEG): container finished" podID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerID="36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b" exitCode=0 Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.801637 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpjmh" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.801657 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpjmh" event={"ID":"f2874ec5-dfa5-46fe-983c-8206b477f1ea","Type":"ContainerDied","Data":"36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b"} Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.802191 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpjmh" event={"ID":"f2874ec5-dfa5-46fe-983c-8206b477f1ea","Type":"ContainerDied","Data":"7abfece731b433f4c468e0ed40f12e75e5feab1ae55c5ee13e3fe8d1d3d43a4c"} Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.802235 4929 scope.go:117] "RemoveContainer" containerID="36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.818489 4929 scope.go:117] "RemoveContainer" containerID="a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.837778 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpjmh"] Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.841344 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lpjmh"] Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.867456 4929 scope.go:117] "RemoveContainer" containerID="97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.890246 4929 scope.go:117] "RemoveContainer" containerID="36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b" Oct 02 12:07:46 crc kubenswrapper[4929]: E1002 12:07:46.890949 4929 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b\": container with ID starting with 36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b not found: ID does not exist" containerID="36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.891539 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b"} err="failed to get container status \"36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b\": rpc error: code = NotFound desc = could not find container \"36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b\": container with ID starting with 36126904b128228f426afce9c311a27d412f9d771a8bab810d2ef8787e276a3b not found: ID does not exist" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.891697 4929 scope.go:117] "RemoveContainer" containerID="a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e" Oct 02 12:07:46 crc kubenswrapper[4929]: E1002 12:07:46.902168 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e\": container with ID starting with a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e not found: ID does not exist" containerID="a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.902539 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e"} err="failed to get container status \"a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e\": rpc error: code = NotFound desc = could not find container \"a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e\": container with ID starting with a4a1beed3ef50127b96547ac5261049fa8ccd667018ce0f722455f2bfebae79e not found: ID does not exist" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.902597 4929 scope.go:117] "RemoveContainer" containerID="97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c" Oct 02 12:07:46 crc kubenswrapper[4929]: E1002 12:07:46.903024 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c\": container with ID starting with 97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c not found: ID does not exist" containerID="97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c" Oct 02 12:07:46 crc kubenswrapper[4929]: I1002 12:07:46.903106 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c"} err="failed to get container status \"97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c\": rpc error: code = NotFound desc = could not find container \"97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c\": container with ID starting with 97d78c399fdc8b40b85c982bbc0bd7a260961e302a6dac62814c6a2faf01bf9c not found: ID does not exist" Oct 02 12:07:48 crc kubenswrapper[4929]: I1002 12:07:48.164640 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" path="/var/lib/kubelet/pods/f2874ec5-dfa5-46fe-983c-8206b477f1ea/volumes" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.469102 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2khqx"] Oct 02 12:08:08 crc kubenswrapper[4929]: E1002 12:08:08.472878 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="registry-server" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.472933 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="registry-server" Oct 02 12:08:08 crc kubenswrapper[4929]: E1002 12:08:08.473010 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="extract-content" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.473022 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="extract-content" Oct 02 12:08:08 crc kubenswrapper[4929]: E1002 12:08:08.473032 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="extract-utilities" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.473040 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="extract-utilities" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.473351 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2874ec5-dfa5-46fe-983c-8206b477f1ea" containerName="registry-server" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.475013 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.475817 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2khqx"] Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.587514 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvdp\" (UniqueName: \"kubernetes.io/projected/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-kube-api-access-hmvdp\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.587667 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-catalog-content\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.587704 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-utilities\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.688652 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-catalog-content\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.688706 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-utilities\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.688746 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvdp\" (UniqueName: \"kubernetes.io/projected/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-kube-api-access-hmvdp\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.689192 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-catalog-content\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.689250 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-utilities\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.706928 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hmvdp\" (UniqueName: \"kubernetes.io/projected/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-kube-api-access-hmvdp\") pod \"community-operators-2khqx\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:08 crc kubenswrapper[4929]: I1002 12:08:08.802183 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.282191 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2khqx"] Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.482745 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ck7lr"] Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.487696 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.500090 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck7lr"] Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.601906 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87rmq\" (UniqueName: \"kubernetes.io/projected/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-kube-api-access-87rmq\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.601996 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-catalog-content\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.602106 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-utilities\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.704108 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87rmq\" (UniqueName: \"kubernetes.io/projected/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-kube-api-access-87rmq\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.704383 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-catalog-content\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.704484 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-utilities\") pod \"redhat-marketplace-ck7lr\" (UID: 
\"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.704818 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-catalog-content\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.704897 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-utilities\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.724777 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87rmq\" (UniqueName: \"kubernetes.io/projected/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-kube-api-access-87rmq\") pod \"redhat-marketplace-ck7lr\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.827213 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.963282 4929 generic.go:334] "Generic (PLEG): container finished" podID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerID="828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da" exitCode=0 Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.963333 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2khqx" event={"ID":"ebf5b837-28ac-4c69-ad4b-5350f21bc00c","Type":"ContainerDied","Data":"828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da"} Oct 02 12:08:09 crc kubenswrapper[4929]: I1002 12:08:09.963364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2khqx" event={"ID":"ebf5b837-28ac-4c69-ad4b-5350f21bc00c","Type":"ContainerStarted","Data":"8dd6862e60969de1abe91a19483000bc0213b0425f9619d3e574bc6734ebc19d"} Oct 02 12:08:10 crc kubenswrapper[4929]: I1002 12:08:10.048775 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck7lr"] Oct 02 12:08:10 crc kubenswrapper[4929]: I1002 12:08:10.970673 4929 generic.go:334] "Generic (PLEG): container finished" podID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerID="736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484" exitCode=0 Oct 02 12:08:10 crc kubenswrapper[4929]: I1002 12:08:10.970737 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck7lr" event={"ID":"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1","Type":"ContainerDied","Data":"736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484"} Oct 02 12:08:10 crc kubenswrapper[4929]: I1002 12:08:10.970793 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck7lr" event={"ID":"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1","Type":"ContainerStarted","Data":"481229778b475b6d8cfd1d481178d75fd8ba525c3392439e25e56693b33072f3"} Oct 02 12:08:10 crc kubenswrapper[4929]: I1002 12:08:10.977316 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-2khqx" event={"ID":"ebf5b837-28ac-4c69-ad4b-5350f21bc00c","Type":"ContainerStarted","Data":"77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb"} Oct 02 12:08:11 crc kubenswrapper[4929]: I1002 12:08:11.985948 4929 generic.go:334] "Generic (PLEG): container finished" podID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerID="77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb" exitCode=0 Oct 02 12:08:11 crc kubenswrapper[4929]: I1002 12:08:11.986166 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2khqx" event={"ID":"ebf5b837-28ac-4c69-ad4b-5350f21bc00c","Type":"ContainerDied","Data":"77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb"} Oct 02 12:08:11 crc kubenswrapper[4929]: I1002 12:08:11.990012 4929 generic.go:334] "Generic (PLEG): container finished" podID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerID="d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31" exitCode=0 Oct 02 12:08:11 crc kubenswrapper[4929]: I1002 12:08:11.990053 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck7lr" event={"ID":"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1","Type":"ContainerDied","Data":"d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31"} Oct 02 12:08:12 crc kubenswrapper[4929]: I1002 12:08:12.997702 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2khqx" event={"ID":"ebf5b837-28ac-4c69-ad4b-5350f21bc00c","Type":"ContainerStarted","Data":"74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485"} Oct 02 12:08:13 crc kubenswrapper[4929]: I1002 12:08:13.000208 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck7lr" event={"ID":"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1","Type":"ContainerStarted","Data":"773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa"} Oct 02 12:08:13 crc kubenswrapper[4929]: I1002 12:08:13.021371 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2khqx" podStartSLOduration=2.414049528 podStartE2EDuration="5.021350038s" podCreationTimestamp="2025-10-02 12:08:08 +0000 UTC" firstStartedPulling="2025-10-02 12:08:09.96492962 +0000 UTC m=+3490.515295984" lastFinishedPulling="2025-10-02 12:08:12.57223013 +0000 UTC m=+3493.122596494" observedRunningTime="2025-10-02 12:08:13.01513958 +0000 UTC m=+3493.565505944" watchObservedRunningTime="2025-10-02 12:08:13.021350038 +0000 UTC m=+3493.571716402" Oct 02 12:08:18 crc kubenswrapper[4929]: I1002 12:08:18.802636 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:18 crc kubenswrapper[4929]: I1002 12:08:18.802923 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:18 crc kubenswrapper[4929]: I1002 12:08:18.846844 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:18 crc kubenswrapper[4929]: I1002 12:08:18.867396 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ck7lr" podStartSLOduration=8.391864981 podStartE2EDuration="9.86736886s" podCreationTimestamp="2025-10-02 12:08:09 +0000 UTC" firstStartedPulling="2025-10-02 
12:08:10.972826635 +0000 UTC m=+3491.523192999" lastFinishedPulling="2025-10-02 12:08:12.448330514 +0000 UTC m=+3492.998696878" observedRunningTime="2025-10-02 12:08:13.03265417 +0000 UTC m=+3493.583020534" watchObservedRunningTime="2025-10-02 12:08:18.86736886 +0000 UTC m=+3499.417735234" Oct 02 12:08:19 crc kubenswrapper[4929]: I1002 12:08:19.093340 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:19 crc kubenswrapper[4929]: I1002 12:08:19.135150 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2khqx"] Oct 02 12:08:19 crc kubenswrapper[4929]: I1002 12:08:19.827696 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:19 crc kubenswrapper[4929]: I1002 12:08:19.827765 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:19 crc kubenswrapper[4929]: I1002 12:08:19.877602 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:20 crc kubenswrapper[4929]: I1002 12:08:20.090756 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.062506 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2khqx" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="registry-server" containerID="cri-o://74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485" gracePeriod=2 Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.475584 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck7lr"] Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.614097 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.667447 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-catalog-content\") pod \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.667555 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-utilities\") pod \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.667601 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmvdp\" (UniqueName: \"kubernetes.io/projected/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-kube-api-access-hmvdp\") pod \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\" (UID: \"ebf5b837-28ac-4c69-ad4b-5350f21bc00c\") " Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.668889 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-utilities" (OuterVolumeSpecName: "utilities") pod "ebf5b837-28ac-4c69-ad4b-5350f21bc00c" (UID: "ebf5b837-28ac-4c69-ad4b-5350f21bc00c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.672974 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-kube-api-access-hmvdp" (OuterVolumeSpecName: "kube-api-access-hmvdp") pod "ebf5b837-28ac-4c69-ad4b-5350f21bc00c" (UID: "ebf5b837-28ac-4c69-ad4b-5350f21bc00c"). InnerVolumeSpecName "kube-api-access-hmvdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.723520 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebf5b837-28ac-4c69-ad4b-5350f21bc00c" (UID: "ebf5b837-28ac-4c69-ad4b-5350f21bc00c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.768678 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.768705 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmvdp\" (UniqueName: \"kubernetes.io/projected/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-kube-api-access-hmvdp\") on node \"crc\" DevicePath \"\"" Oct 02 12:08:21 crc kubenswrapper[4929]: I1002 12:08:21.768715 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf5b837-28ac-4c69-ad4b-5350f21bc00c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.070589 4929 generic.go:334] "Generic (PLEG): container finished" podID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerID="74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485" exitCode=0 Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.070800 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ck7lr" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="registry-server" containerID="cri-o://773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa" gracePeriod=2 Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.070880 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2khqx" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.070972 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2khqx" event={"ID":"ebf5b837-28ac-4c69-ad4b-5350f21bc00c","Type":"ContainerDied","Data":"74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485"} Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.070994 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2khqx" event={"ID":"ebf5b837-28ac-4c69-ad4b-5350f21bc00c","Type":"ContainerDied","Data":"8dd6862e60969de1abe91a19483000bc0213b0425f9619d3e574bc6734ebc19d"} Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.071008 4929 scope.go:117] "RemoveContainer" containerID="74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.095568 4929 scope.go:117] "RemoveContainer" containerID="77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.113272 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2khqx"] Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.118128 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2khqx"] Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.172119 4929 scope.go:117] "RemoveContainer" containerID="828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.172666 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" path="/var/lib/kubelet/pods/ebf5b837-28ac-4c69-ad4b-5350f21bc00c/volumes" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.232886 4929 scope.go:117] 
"RemoveContainer" containerID="74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485" Oct 02 12:08:22 crc kubenswrapper[4929]: E1002 12:08:22.233862 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485\": container with ID starting with 74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485 not found: ID does not exist" containerID="74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.233907 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485"} err="failed to get container status \"74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485\": rpc error: code = NotFound desc = could not find container \"74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485\": container with ID starting with 74e6cc9a968ba6f89f539b92b14ef9566ceee3a785690c49c490c6a6615f4485 not found: ID does not exist" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.233940 4929 scope.go:117] "RemoveContainer" containerID="77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb" Oct 02 12:08:22 crc kubenswrapper[4929]: E1002 12:08:22.234310 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb\": container with ID starting with 77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb not found: ID does not exist" containerID="77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.234346 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb"} err="failed to get container status \"77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb\": rpc error: code = NotFound desc = could not find container \"77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb\": container with ID starting with 77012b4cf1b30ec32ef19702ce063c0269d82bb23ecf71ed76a8b94d8b1cf6cb not found: ID does not exist" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.234409 4929 scope.go:117] "RemoveContainer" containerID="828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da" Oct 02 12:08:22 crc kubenswrapper[4929]: E1002 12:08:22.234678 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da\": container with ID starting with 828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da not found: ID does not exist" containerID="828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.234708 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da"} err="failed to get container status \"828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da\": rpc error: code = NotFound desc = could not find container \"828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da\": container with ID starting with 
828fbdc5d616b847e204ab8f320777c4f8885b358104fc650b854cb775d597da not found: ID does not exist" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.467273 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.576890 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-utilities\") pod \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.577501 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-catalog-content\") pod \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.577672 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87rmq\" (UniqueName: \"kubernetes.io/projected/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-kube-api-access-87rmq\") pod \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\" (UID: \"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1\") " Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.578254 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-utilities" (OuterVolumeSpecName: "utilities") pod "cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" (UID: "cf6895b9-1431-4ff7-9fbb-7bfd34556bc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.580776 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-kube-api-access-87rmq" (OuterVolumeSpecName: "kube-api-access-87rmq") pod "cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" (UID: "cf6895b9-1431-4ff7-9fbb-7bfd34556bc1"). InnerVolumeSpecName "kube-api-access-87rmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.589924 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" (UID: "cf6895b9-1431-4ff7-9fbb-7bfd34556bc1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.679187 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.679415 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87rmq\" (UniqueName: \"kubernetes.io/projected/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-kube-api-access-87rmq\") on node \"crc\" DevicePath \"\"" Oct 02 12:08:22 crc kubenswrapper[4929]: I1002 12:08:22.679476 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.080897 4929 generic.go:334] "Generic (PLEG): container finished" podID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerID="773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa" exitCode=0 Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.080993 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck7lr" event={"ID":"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1","Type":"ContainerDied","Data":"773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa"} Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.080999 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ck7lr" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.081028 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ck7lr" event={"ID":"cf6895b9-1431-4ff7-9fbb-7bfd34556bc1","Type":"ContainerDied","Data":"481229778b475b6d8cfd1d481178d75fd8ba525c3392439e25e56693b33072f3"} Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.081055 4929 scope.go:117] "RemoveContainer" containerID="773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.096635 4929 scope.go:117] "RemoveContainer" containerID="d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.119039 4929 scope.go:117] "RemoveContainer" containerID="736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.129063 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck7lr"] Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.133905 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ck7lr"] Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.135498 4929 scope.go:117] "RemoveContainer" containerID="773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa" Oct 02 12:08:23 crc kubenswrapper[4929]: E1002 12:08:23.135824 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa\": container with ID starting with 773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa not found: ID does not exist" containerID="773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.135862 4929 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa"} err="failed to get container status \"773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa\": rpc error: code = NotFound desc = could not find container \"773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa\": container with ID starting with 773bb5d5e5540c9b68a2a8df886e14c41c0ee8b1af128d9d9cc7392f99ca34aa not found: ID does not exist" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.135891 4929 scope.go:117] "RemoveContainer" containerID="d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31" Oct 02 12:08:23 crc kubenswrapper[4929]: E1002 12:08:23.136174 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31\": container with ID starting with d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31 not found: ID does not exist" containerID="d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.136216 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31"} err="failed to get container status \"d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31\": rpc error: code = NotFound desc = could not find container \"d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31\": container with ID starting with d7382cbe5c09dbd7e134248c1f07b1b279928c3a1243a1a9cf9826d250349e31 not found: ID does not exist" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.136242 4929 scope.go:117] "RemoveContainer" containerID="736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484" Oct 02 12:08:23 crc kubenswrapper[4929]: E1002 12:08:23.136466 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484\": container with ID starting with 736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484 not found: ID does not exist" containerID="736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484" Oct 02 12:08:23 crc kubenswrapper[4929]: I1002 12:08:23.136499 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484"} err="failed to get container status \"736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484\": rpc error: code = NotFound desc = could not find container \"736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484\": container with ID starting with 736bf2951e437659caf2a64b5aeb258ae367d160bb2f432ec05905b01f370484 not found: ID does not exist" Oct 02 12:08:24 crc kubenswrapper[4929]: I1002 12:08:24.169271 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" path="/var/lib/kubelet/pods/cf6895b9-1431-4ff7-9fbb-7bfd34556bc1/volumes" Oct 02 12:09:14 crc kubenswrapper[4929]: I1002 12:09:14.737493 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:09:14 crc kubenswrapper[4929]: I1002 12:09:14.738289 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:09:44 crc kubenswrapper[4929]: I1002 12:09:44.737240 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:09:44 crc kubenswrapper[4929]: I1002 12:09:44.737947 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.737537 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.739361 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.739515 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.740305 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"912ccf0a100d0c6a17cdbd41ee6d5704bdccc909b7753b8466defd0f2fb7a0a7"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.740475 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://912ccf0a100d0c6a17cdbd41ee6d5704bdccc909b7753b8466defd0f2fb7a0a7" gracePeriod=600 Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.914711 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="912ccf0a100d0c6a17cdbd41ee6d5704bdccc909b7753b8466defd0f2fb7a0a7" exitCode=0 Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.914771 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"912ccf0a100d0c6a17cdbd41ee6d5704bdccc909b7753b8466defd0f2fb7a0a7"} Oct 02 12:10:14 crc kubenswrapper[4929]: I1002 12:10:14.914810 4929 scope.go:117] "RemoveContainer" containerID="d122b342f6a99399e36ca9aea220ca3009f5a0ee38a8bb409e997b5f48265cbd" Oct 02 12:10:15 crc kubenswrapper[4929]: I1002 12:10:15.924840 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796"} Oct 02 12:12:44 crc kubenswrapper[4929]: I1002 12:12:44.737076 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:12:44 crc kubenswrapper[4929]: I1002 12:12:44.737724 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:13:14 crc kubenswrapper[4929]: I1002 12:13:14.736651 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:13:14 crc kubenswrapper[4929]: I1002 12:13:14.737130 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:13:44 crc kubenswrapper[4929]: I1002 12:13:44.737199 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:13:44 crc kubenswrapper[4929]: I1002 12:13:44.737698 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:13:44 crc kubenswrapper[4929]: I1002 12:13:44.737744 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:13:44 crc kubenswrapper[4929]: I1002 12:13:44.738226 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 
12:13:44 crc kubenswrapper[4929]: I1002 12:13:44.738279 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" gracePeriod=600 Oct 02 12:13:44 crc kubenswrapper[4929]: E1002 12:13:44.855240 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:13:45 crc kubenswrapper[4929]: I1002 12:13:45.384161 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" exitCode=0 Oct 02 12:13:45 crc kubenswrapper[4929]: I1002 12:13:45.384211 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796"} Oct 02 12:13:45 crc kubenswrapper[4929]: I1002 12:13:45.384251 4929 scope.go:117] "RemoveContainer" containerID="912ccf0a100d0c6a17cdbd41ee6d5704bdccc909b7753b8466defd0f2fb7a0a7" Oct 02 12:13:45 crc kubenswrapper[4929]: I1002 12:13:45.385105 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:13:45 crc kubenswrapper[4929]: E1002 12:13:45.387132 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:13:57 crc kubenswrapper[4929]: I1002 12:13:57.156712 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:13:57 crc kubenswrapper[4929]: E1002 12:13:57.157544 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:14:12 crc kubenswrapper[4929]: I1002 12:14:12.170069 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:14:12 crc kubenswrapper[4929]: E1002 12:14:12.171981 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:14:24 crc kubenswrapper[4929]: I1002 12:14:24.157501 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:14:24 crc kubenswrapper[4929]: E1002 12:14:24.158262 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:14:39 crc kubenswrapper[4929]: I1002 12:14:39.157122 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:14:39 crc kubenswrapper[4929]: E1002 12:14:39.158289 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:14:51 crc kubenswrapper[4929]: I1002 12:14:51.157022 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:14:51 crc kubenswrapper[4929]: E1002 12:14:51.157716 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.142834 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v"] Oct 02 12:15:00 crc kubenswrapper[4929]: E1002 12:15:00.143865 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.143879 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4929]: E1002 12:15:00.143892 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.143898 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4929]: E1002 12:15:00.143909 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="extract-content" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.143916 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="extract-content" Oct 02 12:15:00 crc 
kubenswrapper[4929]: E1002 12:15:00.143929 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.143935 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4929]: E1002 12:15:00.143952 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="extract-content" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.143969 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="extract-content" Oct 02 12:15:00 crc kubenswrapper[4929]: E1002 12:15:00.143981 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.143986 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.144121 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6895b9-1431-4ff7-9fbb-7bfd34556bc1" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.144173 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf5b837-28ac-4c69-ad4b-5350f21bc00c" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.144620 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.147441 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.148256 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.150396 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v"] Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.290998 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28d668-46df-4df8-b309-7d867c16dfe8-secret-volume\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.291328 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfkr6\" (UniqueName: \"kubernetes.io/projected/0b28d668-46df-4df8-b309-7d867c16dfe8-kube-api-access-jfkr6\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.291376 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/0b28d668-46df-4df8-b309-7d867c16dfe8-config-volume\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.392929 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28d668-46df-4df8-b309-7d867c16dfe8-secret-volume\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.393038 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfkr6\" (UniqueName: \"kubernetes.io/projected/0b28d668-46df-4df8-b309-7d867c16dfe8-kube-api-access-jfkr6\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.393080 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28d668-46df-4df8-b309-7d867c16dfe8-config-volume\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.394600 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28d668-46df-4df8-b309-7d867c16dfe8-config-volume\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.398662 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28d668-46df-4df8-b309-7d867c16dfe8-secret-volume\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.408697 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfkr6\" (UniqueName: \"kubernetes.io/projected/0b28d668-46df-4df8-b309-7d867c16dfe8-kube-api-access-jfkr6\") pod \"collect-profiles-29323455-9pt2v\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.474179 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:00 crc kubenswrapper[4929]: I1002 12:15:00.871323 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v"] Oct 02 12:15:01 crc kubenswrapper[4929]: I1002 12:15:01.921404 4929 generic.go:334] "Generic (PLEG): container finished" podID="0b28d668-46df-4df8-b309-7d867c16dfe8" containerID="7de53dfe819b9bc511537b8f984471578530b9adc639283c59fb8c4e2f09d8a7" exitCode=0 Oct 02 12:15:01 crc kubenswrapper[4929]: I1002 12:15:01.921478 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" event={"ID":"0b28d668-46df-4df8-b309-7d867c16dfe8","Type":"ContainerDied","Data":"7de53dfe819b9bc511537b8f984471578530b9adc639283c59fb8c4e2f09d8a7"} Oct 02 12:15:01 crc kubenswrapper[4929]: I1002 12:15:01.921705 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" event={"ID":"0b28d668-46df-4df8-b309-7d867c16dfe8","Type":"ContainerStarted","Data":"f2b131e67fb2d16832fc099b132f40fc02b4c5ff8c0b0a8a3415a7aa64ebe1b9"} Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.173385 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.335392 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28d668-46df-4df8-b309-7d867c16dfe8-secret-volume\") pod \"0b28d668-46df-4df8-b309-7d867c16dfe8\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.335440 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28d668-46df-4df8-b309-7d867c16dfe8-config-volume\") pod \"0b28d668-46df-4df8-b309-7d867c16dfe8\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.335494 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfkr6\" (UniqueName: \"kubernetes.io/projected/0b28d668-46df-4df8-b309-7d867c16dfe8-kube-api-access-jfkr6\") pod \"0b28d668-46df-4df8-b309-7d867c16dfe8\" (UID: \"0b28d668-46df-4df8-b309-7d867c16dfe8\") " Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.336283 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b28d668-46df-4df8-b309-7d867c16dfe8-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b28d668-46df-4df8-b309-7d867c16dfe8" (UID: "0b28d668-46df-4df8-b309-7d867c16dfe8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.340470 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b28d668-46df-4df8-b309-7d867c16dfe8-kube-api-access-jfkr6" (OuterVolumeSpecName: "kube-api-access-jfkr6") pod "0b28d668-46df-4df8-b309-7d867c16dfe8" (UID: "0b28d668-46df-4df8-b309-7d867c16dfe8"). InnerVolumeSpecName "kube-api-access-jfkr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.341616 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b28d668-46df-4df8-b309-7d867c16dfe8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b28d668-46df-4df8-b309-7d867c16dfe8" (UID: "0b28d668-46df-4df8-b309-7d867c16dfe8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.437226 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b28d668-46df-4df8-b309-7d867c16dfe8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.437286 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b28d668-46df-4df8-b309-7d867c16dfe8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.437314 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfkr6\" (UniqueName: \"kubernetes.io/projected/0b28d668-46df-4df8-b309-7d867c16dfe8-kube-api-access-jfkr6\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.943537 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" event={"ID":"0b28d668-46df-4df8-b309-7d867c16dfe8","Type":"ContainerDied","Data":"f2b131e67fb2d16832fc099b132f40fc02b4c5ff8c0b0a8a3415a7aa64ebe1b9"} Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.943596 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2b131e67fb2d16832fc099b132f40fc02b4c5ff8c0b0a8a3415a7aa64ebe1b9" Oct 02 12:15:03 crc kubenswrapper[4929]: I1002 12:15:03.943685 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v" Oct 02 12:15:04 crc kubenswrapper[4929]: I1002 12:15:04.157343 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:15:04 crc kubenswrapper[4929]: E1002 12:15:04.157559 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:15:04 crc kubenswrapper[4929]: I1002 12:15:04.246272 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk"] Oct 02 12:15:04 crc kubenswrapper[4929]: I1002 12:15:04.253688 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-5csgk"] Oct 02 12:15:06 crc kubenswrapper[4929]: I1002 12:15:06.165161 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d427bb5-77e9-420b-aa34-52fa95ae93b1" path="/var/lib/kubelet/pods/3d427bb5-77e9-420b-aa34-52fa95ae93b1/volumes" Oct 02 12:15:14 crc kubenswrapper[4929]: I1002 12:15:14.979400 4929 scope.go:117] "RemoveContainer" containerID="cfef63239e48f4bab8a8ef42f32455bde329eae903a0cefb60fc43115a4f34f5" Oct 02 12:15:16 crc kubenswrapper[4929]: I1002 12:15:16.156785 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:15:16 crc kubenswrapper[4929]: E1002 12:15:16.157369 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:15:29 crc kubenswrapper[4929]: I1002 12:15:29.157605 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:15:29 crc kubenswrapper[4929]: E1002 12:15:29.159060 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:15:40 crc kubenswrapper[4929]: I1002 12:15:40.161670 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:15:40 crc kubenswrapper[4929]: E1002 12:15:40.162480 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:15:55 crc kubenswrapper[4929]: I1002 12:15:55.156678 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:15:55 crc kubenswrapper[4929]: E1002 12:15:55.157430 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:16:08 crc kubenswrapper[4929]: I1002 12:16:08.156736 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:16:08 crc kubenswrapper[4929]: E1002 12:16:08.157511 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.561274 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dj7jm"] Oct 02 12:16:14 crc kubenswrapper[4929]: E1002 12:16:14.562096 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b28d668-46df-4df8-b309-7d867c16dfe8" containerName="collect-profiles" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.562108 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b28d668-46df-4df8-b309-7d867c16dfe8" containerName="collect-profiles" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.562293 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b28d668-46df-4df8-b309-7d867c16dfe8" containerName="collect-profiles" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.563397 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.567944 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj7jm"] Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.586039 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-utilities\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.586110 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc8h8\" (UniqueName: \"kubernetes.io/projected/5ece1bd7-7619-4d49-aefb-392c1ed12d97-kube-api-access-pc8h8\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.586145 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-catalog-content\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.687426 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-utilities\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.687483 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc8h8\" (UniqueName: \"kubernetes.io/projected/5ece1bd7-7619-4d49-aefb-392c1ed12d97-kube-api-access-pc8h8\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.687592 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-catalog-content\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.688225 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-utilities\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.688417 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-catalog-content\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.707891 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pc8h8\" (UniqueName: \"kubernetes.io/projected/5ece1bd7-7619-4d49-aefb-392c1ed12d97-kube-api-access-pc8h8\") pod \"certified-operators-dj7jm\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:14 crc kubenswrapper[4929]: I1002 12:16:14.908034 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:15 crc kubenswrapper[4929]: I1002 12:16:15.209835 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj7jm"] Oct 02 12:16:15 crc kubenswrapper[4929]: W1002 12:16:15.216354 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ece1bd7_7619_4d49_aefb_392c1ed12d97.slice/crio-5b2a1958948e9f2b523791f673ded1407d408667d3384931b2f46ed5e6674975 WatchSource:0}: Error finding container 5b2a1958948e9f2b523791f673ded1407d408667d3384931b2f46ed5e6674975: Status 404 returned error can't find the container with id 5b2a1958948e9f2b523791f673ded1407d408667d3384931b2f46ed5e6674975 Oct 02 12:16:15 crc kubenswrapper[4929]: I1002 12:16:15.428021 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7jm" event={"ID":"5ece1bd7-7619-4d49-aefb-392c1ed12d97","Type":"ContainerStarted","Data":"5b2a1958948e9f2b523791f673ded1407d408667d3384931b2f46ed5e6674975"} Oct 02 12:16:16 crc kubenswrapper[4929]: I1002 12:16:16.435263 4929 generic.go:334] "Generic (PLEG): container finished" podID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerID="b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a" exitCode=0 Oct 02 12:16:16 crc kubenswrapper[4929]: I1002 12:16:16.435318 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7jm" event={"ID":"5ece1bd7-7619-4d49-aefb-392c1ed12d97","Type":"ContainerDied","Data":"b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a"} Oct 02 12:16:16 crc kubenswrapper[4929]: I1002 12:16:16.437237 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:16:17 crc kubenswrapper[4929]: I1002 12:16:17.450497 4929 generic.go:334] "Generic (PLEG): container finished" podID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerID="8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0" exitCode=0 Oct 02 12:16:17 crc kubenswrapper[4929]: I1002 12:16:17.450993 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7jm" event={"ID":"5ece1bd7-7619-4d49-aefb-392c1ed12d97","Type":"ContainerDied","Data":"8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0"} Oct 02 12:16:18 crc kubenswrapper[4929]: I1002 12:16:18.462411 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7jm" event={"ID":"5ece1bd7-7619-4d49-aefb-392c1ed12d97","Type":"ContainerStarted","Data":"1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877"} Oct 02 12:16:19 crc kubenswrapper[4929]: I1002 12:16:19.483316 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dj7jm" podStartSLOduration=3.745715656 podStartE2EDuration="5.483298616s" podCreationTimestamp="2025-10-02 12:16:14 +0000 UTC" firstStartedPulling="2025-10-02 12:16:16.436869146 +0000 UTC 
m=+3976.987235510" lastFinishedPulling="2025-10-02 12:16:18.174452106 +0000 UTC m=+3978.724818470" observedRunningTime="2025-10-02 12:16:19.480820965 +0000 UTC m=+3980.031187339" watchObservedRunningTime="2025-10-02 12:16:19.483298616 +0000 UTC m=+3980.033664980" Oct 02 12:16:22 crc kubenswrapper[4929]: I1002 12:16:22.156766 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:16:22 crc kubenswrapper[4929]: E1002 12:16:22.157246 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:16:24 crc kubenswrapper[4929]: I1002 12:16:24.908516 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:24 crc kubenswrapper[4929]: I1002 12:16:24.908934 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:24 crc kubenswrapper[4929]: I1002 12:16:24.971533 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:25 crc kubenswrapper[4929]: I1002 12:16:25.553149 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:25 crc kubenswrapper[4929]: I1002 12:16:25.596910 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dj7jm"] Oct 02 12:16:27 crc kubenswrapper[4929]: I1002 12:16:27.519126 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dj7jm" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="registry-server" containerID="cri-o://1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877" gracePeriod=2 Oct 02 12:16:27 crc kubenswrapper[4929]: I1002 12:16:27.912923 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.069137 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc8h8\" (UniqueName: \"kubernetes.io/projected/5ece1bd7-7619-4d49-aefb-392c1ed12d97-kube-api-access-pc8h8\") pod \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.069220 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-utilities\") pod \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.069248 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-catalog-content\") pod \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\" (UID: \"5ece1bd7-7619-4d49-aefb-392c1ed12d97\") " Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.070838 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-utilities" (OuterVolumeSpecName: "utilities") pod "5ece1bd7-7619-4d49-aefb-392c1ed12d97" (UID: "5ece1bd7-7619-4d49-aefb-392c1ed12d97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.074774 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ece1bd7-7619-4d49-aefb-392c1ed12d97-kube-api-access-pc8h8" (OuterVolumeSpecName: "kube-api-access-pc8h8") pod "5ece1bd7-7619-4d49-aefb-392c1ed12d97" (UID: "5ece1bd7-7619-4d49-aefb-392c1ed12d97"). InnerVolumeSpecName "kube-api-access-pc8h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.111381 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ece1bd7-7619-4d49-aefb-392c1ed12d97" (UID: "5ece1bd7-7619-4d49-aefb-392c1ed12d97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.171218 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.171251 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ece1bd7-7619-4d49-aefb-392c1ed12d97-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.171269 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc8h8\" (UniqueName: \"kubernetes.io/projected/5ece1bd7-7619-4d49-aefb-392c1ed12d97-kube-api-access-pc8h8\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.527318 4929 generic.go:334] "Generic (PLEG): container finished" podID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerID="1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877" exitCode=0 Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.527364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7jm" event={"ID":"5ece1bd7-7619-4d49-aefb-392c1ed12d97","Type":"ContainerDied","Data":"1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877"} Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.527373 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj7jm" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.527398 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7jm" event={"ID":"5ece1bd7-7619-4d49-aefb-392c1ed12d97","Type":"ContainerDied","Data":"5b2a1958948e9f2b523791f673ded1407d408667d3384931b2f46ed5e6674975"} Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.527417 4929 scope.go:117] "RemoveContainer" containerID="1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.549935 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dj7jm"] Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.554142 4929 scope.go:117] "RemoveContainer" containerID="8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.554810 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dj7jm"] Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.574316 4929 scope.go:117] "RemoveContainer" containerID="b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.611322 4929 scope.go:117] "RemoveContainer" containerID="1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877" Oct 02 12:16:28 crc kubenswrapper[4929]: E1002 12:16:28.612017 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877\": container with ID starting with 1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877 not found: ID does not exist" containerID="1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.612057 
4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877"} err="failed to get container status \"1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877\": rpc error: code = NotFound desc = could not find container \"1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877\": container with ID starting with 1a42f04140b7105beac2bfce4320449e167b6b8920e8403d82461cd4d6659877 not found: ID does not exist" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.612085 4929 scope.go:117] "RemoveContainer" containerID="8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0" Oct 02 12:16:28 crc kubenswrapper[4929]: E1002 12:16:28.612529 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0\": container with ID starting with 8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0 not found: ID does not exist" containerID="8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.612555 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0"} err="failed to get container status \"8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0\": rpc error: code = NotFound desc = could not find container \"8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0\": container with ID starting with 8f71639ff40068837e78b791a4e94a70056bb8f873bb883f586bc5ec2eca98c0 not found: ID does not exist" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.612570 4929 scope.go:117] "RemoveContainer" containerID="b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a" Oct 02 12:16:28 crc kubenswrapper[4929]: E1002 12:16:28.612895 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a\": container with ID starting with b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a not found: ID does not exist" containerID="b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a" Oct 02 12:16:28 crc kubenswrapper[4929]: I1002 12:16:28.612918 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a"} err="failed to get container status \"b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a\": rpc error: code = NotFound desc = could not find container \"b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a\": container with ID starting with b3ad29ce91c3dce2e9ad5258afaf61957267b14daf4a67c598df5f817d55665a not found: ID does not exist" Oct 02 12:16:30 crc kubenswrapper[4929]: I1002 12:16:30.166828 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" path="/var/lib/kubelet/pods/5ece1bd7-7619-4d49-aefb-392c1ed12d97/volumes" Oct 02 12:16:37 crc kubenswrapper[4929]: I1002 12:16:37.156134 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:16:37 crc kubenswrapper[4929]: E1002 12:16:37.156773 4929 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:16:48 crc kubenswrapper[4929]: I1002 12:16:48.157230 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:16:48 crc kubenswrapper[4929]: E1002 12:16:48.158196 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:17:01 crc kubenswrapper[4929]: I1002 12:17:01.156032 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:17:01 crc kubenswrapper[4929]: E1002 12:17:01.156610 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:17:13 crc kubenswrapper[4929]: I1002 12:17:13.156571 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:17:13 crc kubenswrapper[4929]: E1002 12:17:13.157275 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:17:26 crc kubenswrapper[4929]: I1002 12:17:26.157104 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:17:26 crc kubenswrapper[4929]: E1002 12:17:26.157941 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:17:37 crc kubenswrapper[4929]: I1002 12:17:37.156804 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:17:37 crc kubenswrapper[4929]: E1002 12:17:37.158135 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:17:48 crc kubenswrapper[4929]: I1002 12:17:48.157001 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:17:48 crc kubenswrapper[4929]: E1002 12:17:48.157754 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:18:02 crc kubenswrapper[4929]: I1002 12:18:02.157111 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:18:02 crc kubenswrapper[4929]: E1002 12:18:02.158763 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:18:17 crc kubenswrapper[4929]: I1002 12:18:17.157013 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:18:17 crc kubenswrapper[4929]: E1002 12:18:17.157946 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:18:28 crc kubenswrapper[4929]: I1002 12:18:28.156613 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:18:28 crc kubenswrapper[4929]: E1002 12:18:28.157390 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:18:42 crc kubenswrapper[4929]: I1002 12:18:42.157061 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:18:42 crc kubenswrapper[4929]: E1002 12:18:42.157835 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.583849 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l6lr9"] Oct 02 12:18:44 crc kubenswrapper[4929]: E1002 12:18:44.584567 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="extract-utilities" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.584585 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="extract-utilities" Oct 02 12:18:44 crc kubenswrapper[4929]: E1002 12:18:44.584613 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="registry-server" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.584620 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="registry-server" Oct 02 12:18:44 crc kubenswrapper[4929]: E1002 12:18:44.584640 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="extract-content" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.584649 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="extract-content" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.584806 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ece1bd7-7619-4d49-aefb-392c1ed12d97" containerName="registry-server" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.586003 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.604902 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6lr9"] Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.690049 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-utilities\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.690124 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8pr\" (UniqueName: \"kubernetes.io/projected/71c0823d-acc5-4414-b3af-51163e481841-kube-api-access-5v8pr\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.690151 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-catalog-content\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.791971 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-utilities\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " 
pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.792049 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8pr\" (UniqueName: \"kubernetes.io/projected/71c0823d-acc5-4414-b3af-51163e481841-kube-api-access-5v8pr\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.792083 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-catalog-content\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.792738 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-utilities\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.792790 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-catalog-content\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.811244 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8pr\" (UniqueName: \"kubernetes.io/projected/71c0823d-acc5-4414-b3af-51163e481841-kube-api-access-5v8pr\") pod \"redhat-operators-l6lr9\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:44 crc kubenswrapper[4929]: I1002 12:18:44.914949 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:45 crc kubenswrapper[4929]: I1002 12:18:45.144814 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l6lr9"] Oct 02 12:18:45 crc kubenswrapper[4929]: I1002 12:18:45.534267 4929 generic.go:334] "Generic (PLEG): container finished" podID="71c0823d-acc5-4414-b3af-51163e481841" containerID="8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab" exitCode=0 Oct 02 12:18:45 crc kubenswrapper[4929]: I1002 12:18:45.534382 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6lr9" event={"ID":"71c0823d-acc5-4414-b3af-51163e481841","Type":"ContainerDied","Data":"8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab"} Oct 02 12:18:45 crc kubenswrapper[4929]: I1002 12:18:45.534610 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6lr9" event={"ID":"71c0823d-acc5-4414-b3af-51163e481841","Type":"ContainerStarted","Data":"53f1f287919ddf207f6b8759a81f40e76ff31986188811add642c7ff3ac636f8"} Oct 02 12:18:46 crc kubenswrapper[4929]: I1002 12:18:46.547851 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6lr9" event={"ID":"71c0823d-acc5-4414-b3af-51163e481841","Type":"ContainerStarted","Data":"3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e"} Oct 02 12:18:47 crc kubenswrapper[4929]: I1002 12:18:47.560126 4929 generic.go:334] "Generic (PLEG): container finished" podID="71c0823d-acc5-4414-b3af-51163e481841" containerID="3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e" exitCode=0 Oct 02 12:18:47 crc kubenswrapper[4929]: I1002 12:18:47.560398 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6lr9" event={"ID":"71c0823d-acc5-4414-b3af-51163e481841","Type":"ContainerDied","Data":"3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e"} Oct 02 12:18:48 crc kubenswrapper[4929]: I1002 12:18:48.569098 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6lr9" event={"ID":"71c0823d-acc5-4414-b3af-51163e481841","Type":"ContainerStarted","Data":"6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659"} Oct 02 12:18:48 crc kubenswrapper[4929]: I1002 12:18:48.585005 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l6lr9" podStartSLOduration=2.151554764 podStartE2EDuration="4.584982629s" podCreationTimestamp="2025-10-02 12:18:44 +0000 UTC" firstStartedPulling="2025-10-02 12:18:45.535732544 +0000 UTC m=+4126.086098908" lastFinishedPulling="2025-10-02 12:18:47.969160399 +0000 UTC m=+4128.519526773" observedRunningTime="2025-10-02 12:18:48.583758033 +0000 UTC m=+4129.134124397" watchObservedRunningTime="2025-10-02 12:18:48.584982629 +0000 UTC m=+4129.135348993" Oct 02 12:18:53 crc kubenswrapper[4929]: I1002 12:18:53.157008 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:18:53 crc kubenswrapper[4929]: I1002 12:18:53.606165 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"25b857f254fa55aaeec7ea22ddf98e1b68381c6e0b2a9daebd266814371792ea"} Oct 02 12:18:54 crc kubenswrapper[4929]: I1002 
12:18:54.915389 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:54 crc kubenswrapper[4929]: I1002 12:18:54.915755 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.087657 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-swvfj"] Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.089607 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.104392 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvfj"] Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.153830 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.245301 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tss68\" (UniqueName: \"kubernetes.io/projected/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-kube-api-access-tss68\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.245405 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-utilities\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.245468 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-catalog-content\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.346889 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tss68\" (UniqueName: \"kubernetes.io/projected/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-kube-api-access-tss68\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.346995 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-utilities\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.347071 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-catalog-content\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.347688 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-catalog-content\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.347747 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-utilities\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.369004 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tss68\" (UniqueName: \"kubernetes.io/projected/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-kube-api-access-tss68\") pod \"redhat-marketplace-swvfj\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.422821 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.690043 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:55 crc kubenswrapper[4929]: I1002 12:18:55.998055 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvfj"] Oct 02 12:18:56 crc kubenswrapper[4929]: I1002 12:18:56.635216 4929 generic.go:334] "Generic (PLEG): container finished" podID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerID="044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a" exitCode=0 Oct 02 12:18:56 crc kubenswrapper[4929]: I1002 12:18:56.635290 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvfj" event={"ID":"1358ea8a-bd79-4745-8f94-6b9e24ed6b26","Type":"ContainerDied","Data":"044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a"} Oct 02 12:18:56 crc kubenswrapper[4929]: I1002 12:18:56.635340 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvfj" event={"ID":"1358ea8a-bd79-4745-8f94-6b9e24ed6b26","Type":"ContainerStarted","Data":"a75b440076f90a352981ae738818dc0c44213a686a3a203ab4a929b21bf8cce4"} Oct 02 12:18:57 crc kubenswrapper[4929]: I1002 12:18:57.444087 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l6lr9"] Oct 02 12:18:57 crc kubenswrapper[4929]: I1002 12:18:57.645211 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l6lr9" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="registry-server" containerID="cri-o://6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659" gracePeriod=2 Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.550319 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.657540 4929 generic.go:334] "Generic (PLEG): container finished" podID="71c0823d-acc5-4414-b3af-51163e481841" containerID="6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659" exitCode=0 Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.657636 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6lr9" event={"ID":"71c0823d-acc5-4414-b3af-51163e481841","Type":"ContainerDied","Data":"6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659"} Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.657658 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l6lr9" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.657673 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l6lr9" event={"ID":"71c0823d-acc5-4414-b3af-51163e481841","Type":"ContainerDied","Data":"53f1f287919ddf207f6b8759a81f40e76ff31986188811add642c7ff3ac636f8"} Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.657700 4929 scope.go:117] "RemoveContainer" containerID="6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.666694 4929 generic.go:334] "Generic (PLEG): container finished" podID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerID="964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04" exitCode=0 Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.666760 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvfj" event={"ID":"1358ea8a-bd79-4745-8f94-6b9e24ed6b26","Type":"ContainerDied","Data":"964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04"} Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.700151 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v8pr\" (UniqueName: \"kubernetes.io/projected/71c0823d-acc5-4414-b3af-51163e481841-kube-api-access-5v8pr\") pod \"71c0823d-acc5-4414-b3af-51163e481841\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.700205 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-utilities\") pod \"71c0823d-acc5-4414-b3af-51163e481841\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.700257 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-catalog-content\") pod \"71c0823d-acc5-4414-b3af-51163e481841\" (UID: \"71c0823d-acc5-4414-b3af-51163e481841\") " Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.702170 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-utilities" (OuterVolumeSpecName: "utilities") pod "71c0823d-acc5-4414-b3af-51163e481841" (UID: "71c0823d-acc5-4414-b3af-51163e481841"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.705305 4929 scope.go:117] "RemoveContainer" containerID="3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.716792 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c0823d-acc5-4414-b3af-51163e481841-kube-api-access-5v8pr" (OuterVolumeSpecName: "kube-api-access-5v8pr") pod "71c0823d-acc5-4414-b3af-51163e481841" (UID: "71c0823d-acc5-4414-b3af-51163e481841"). InnerVolumeSpecName "kube-api-access-5v8pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.741720 4929 scope.go:117] "RemoveContainer" containerID="8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.767069 4929 scope.go:117] "RemoveContainer" containerID="6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659" Oct 02 12:18:58 crc kubenswrapper[4929]: E1002 12:18:58.767686 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659\": container with ID starting with 6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659 not found: ID does not exist" containerID="6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.767884 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659"} err="failed to get container status \"6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659\": rpc error: code = NotFound desc = could not find container \"6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659\": container with ID starting with 6ac635bd421a46a6bfdcff9b6bd6bcb2c312d73c658a1c40ea37e740a61e4659 not found: ID does not exist" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.768025 4929 scope.go:117] "RemoveContainer" containerID="3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e" Oct 02 12:18:58 crc kubenswrapper[4929]: E1002 12:18:58.768427 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e\": container with ID starting with 3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e not found: ID does not exist" containerID="3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.768471 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e"} err="failed to get container status \"3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e\": rpc error: code = NotFound desc = could not find container \"3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e\": container with ID starting with 3f03c8e38610479089a1d5be8378060bfb90b87b21f8d27055af591e1b767c2e not found: ID does not exist" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.768498 4929 scope.go:117] "RemoveContainer" containerID="8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab" Oct 02 12:18:58 crc 
kubenswrapper[4929]: E1002 12:18:58.768732 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab\": container with ID starting with 8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab not found: ID does not exist" containerID="8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.769255 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab"} err="failed to get container status \"8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab\": rpc error: code = NotFound desc = could not find container \"8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab\": container with ID starting with 8b59e6e6883ee2e9147f540a4ba2ba1f6a6ad18e475d7405f68a8a515f5ec4ab not found: ID does not exist" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.794139 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71c0823d-acc5-4414-b3af-51163e481841" (UID: "71c0823d-acc5-4414-b3af-51163e481841"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.802879 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v8pr\" (UniqueName: \"kubernetes.io/projected/71c0823d-acc5-4414-b3af-51163e481841-kube-api-access-5v8pr\") on node \"crc\" DevicePath \"\"" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.803049 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.803128 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c0823d-acc5-4414-b3af-51163e481841-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.993470 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l6lr9"] Oct 02 12:18:58 crc kubenswrapper[4929]: I1002 12:18:58.997836 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l6lr9"] Oct 02 12:18:59 crc kubenswrapper[4929]: I1002 12:18:59.681210 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvfj" event={"ID":"1358ea8a-bd79-4745-8f94-6b9e24ed6b26","Type":"ContainerStarted","Data":"40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d"} Oct 02 12:18:59 crc kubenswrapper[4929]: I1002 12:18:59.700743 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-swvfj" podStartSLOduration=2.117128789 podStartE2EDuration="4.700725522s" podCreationTimestamp="2025-10-02 12:18:55 +0000 UTC" firstStartedPulling="2025-10-02 12:18:56.638028653 +0000 UTC m=+4137.188395017" lastFinishedPulling="2025-10-02 12:18:59.221625366 +0000 UTC m=+4139.771991750" observedRunningTime="2025-10-02 12:18:59.695594498 +0000 UTC m=+4140.245960862" watchObservedRunningTime="2025-10-02 
12:18:59.700725522 +0000 UTC m=+4140.251091886" Oct 02 12:19:00 crc kubenswrapper[4929]: I1002 12:19:00.174887 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c0823d-acc5-4414-b3af-51163e481841" path="/var/lib/kubelet/pods/71c0823d-acc5-4414-b3af-51163e481841/volumes" Oct 02 12:19:05 crc kubenswrapper[4929]: I1002 12:19:05.423940 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:19:05 crc kubenswrapper[4929]: I1002 12:19:05.425284 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:19:05 crc kubenswrapper[4929]: I1002 12:19:05.478240 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:19:05 crc kubenswrapper[4929]: I1002 12:19:05.779854 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:19:05 crc kubenswrapper[4929]: I1002 12:19:05.838857 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvfj"] Oct 02 12:19:07 crc kubenswrapper[4929]: I1002 12:19:07.737243 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-swvfj" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="registry-server" containerID="cri-o://40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d" gracePeriod=2 Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.109447 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.135019 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-catalog-content\") pod \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.135061 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tss68\" (UniqueName: \"kubernetes.io/projected/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-kube-api-access-tss68\") pod \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.135081 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-utilities\") pod \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\" (UID: \"1358ea8a-bd79-4745-8f94-6b9e24ed6b26\") " Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.136143 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-utilities" (OuterVolumeSpecName: "utilities") pod "1358ea8a-bd79-4745-8f94-6b9e24ed6b26" (UID: "1358ea8a-bd79-4745-8f94-6b9e24ed6b26"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.140890 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-kube-api-access-tss68" (OuterVolumeSpecName: "kube-api-access-tss68") pod "1358ea8a-bd79-4745-8f94-6b9e24ed6b26" (UID: "1358ea8a-bd79-4745-8f94-6b9e24ed6b26"). InnerVolumeSpecName "kube-api-access-tss68". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.152276 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1358ea8a-bd79-4745-8f94-6b9e24ed6b26" (UID: "1358ea8a-bd79-4745-8f94-6b9e24ed6b26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.236502 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.236565 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tss68\" (UniqueName: \"kubernetes.io/projected/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-kube-api-access-tss68\") on node \"crc\" DevicePath \"\"" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.236584 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1358ea8a-bd79-4745-8f94-6b9e24ed6b26-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.751096 4929 generic.go:334] "Generic (PLEG): container finished" podID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerID="40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d" exitCode=0 Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.751143 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvfj" event={"ID":"1358ea8a-bd79-4745-8f94-6b9e24ed6b26","Type":"ContainerDied","Data":"40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d"} Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.751175 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-swvfj" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.751202 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-swvfj" event={"ID":"1358ea8a-bd79-4745-8f94-6b9e24ed6b26","Type":"ContainerDied","Data":"a75b440076f90a352981ae738818dc0c44213a686a3a203ab4a929b21bf8cce4"} Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.751222 4929 scope.go:117] "RemoveContainer" containerID="40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.771931 4929 scope.go:117] "RemoveContainer" containerID="964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.775574 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvfj"] Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.779676 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-swvfj"] Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.788695 4929 scope.go:117] "RemoveContainer" containerID="044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.826105 4929 scope.go:117] "RemoveContainer" containerID="40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d" Oct 02 12:19:08 crc kubenswrapper[4929]: E1002 12:19:08.826603 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d\": container with ID starting with 40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d not found: ID does not exist" containerID="40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.826639 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d"} err="failed to get container status \"40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d\": rpc error: code = NotFound desc = could not find container \"40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d\": container with ID starting with 40cc783aeac54bfb87a587e2d8582a75dd6931eff212af3c079cfcfc22e09d7d not found: ID does not exist" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.826663 4929 scope.go:117] "RemoveContainer" containerID="964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04" Oct 02 12:19:08 crc kubenswrapper[4929]: E1002 12:19:08.826980 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04\": container with ID starting with 964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04 not found: ID does not exist" containerID="964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.827019 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04"} err="failed to get container status \"964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04\": rpc error: code = NotFound desc = could not find 
container \"964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04\": container with ID starting with 964dfeacc50674ad6c41e63c3a00fd807371eef83fe96e42d4526132514f2c04 not found: ID does not exist" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.827047 4929 scope.go:117] "RemoveContainer" containerID="044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a" Oct 02 12:19:08 crc kubenswrapper[4929]: E1002 12:19:08.827413 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a\": container with ID starting with 044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a not found: ID does not exist" containerID="044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a" Oct 02 12:19:08 crc kubenswrapper[4929]: I1002 12:19:08.827437 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a"} err="failed to get container status \"044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a\": rpc error: code = NotFound desc = could not find container \"044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a\": container with ID starting with 044eae4213ca4606ccdb0cfa492ad8158e38f3520e98716af81c7e326c22121a not found: ID does not exist" Oct 02 12:19:10 crc kubenswrapper[4929]: I1002 12:19:10.165167 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" path="/var/lib/kubelet/pods/1358ea8a-bd79-4745-8f94-6b9e24ed6b26/volumes" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.648909 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bpqbl"] Oct 02 12:19:32 crc kubenswrapper[4929]: E1002 12:19:32.650046 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="registry-server" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650069 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="registry-server" Oct 02 12:19:32 crc kubenswrapper[4929]: E1002 12:19:32.650100 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="registry-server" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650111 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="registry-server" Oct 02 12:19:32 crc kubenswrapper[4929]: E1002 12:19:32.650140 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="extract-content" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650152 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="extract-content" Oct 02 12:19:32 crc kubenswrapper[4929]: E1002 12:19:32.650181 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="extract-utilities" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650193 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="extract-utilities" Oct 02 12:19:32 crc kubenswrapper[4929]: E1002 12:19:32.650210 4929 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="extract-content" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650221 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="extract-content" Oct 02 12:19:32 crc kubenswrapper[4929]: E1002 12:19:32.650245 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="extract-utilities" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650255 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="extract-utilities" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650493 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c0823d-acc5-4414-b3af-51163e481841" containerName="registry-server" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.650514 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1358ea8a-bd79-4745-8f94-6b9e24ed6b26" containerName="registry-server" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.653768 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.658306 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpqbl"] Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.788613 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-catalog-content\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.788710 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-utilities\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.788769 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8b4k\" (UniqueName: \"kubernetes.io/projected/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-kube-api-access-z8b4k\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.889667 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-catalog-content\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.889723 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-utilities\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: 
I1002 12:19:32.889790 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8b4k\" (UniqueName: \"kubernetes.io/projected/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-kube-api-access-z8b4k\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.890173 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-catalog-content\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.890245 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-utilities\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.910099 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8b4k\" (UniqueName: \"kubernetes.io/projected/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-kube-api-access-z8b4k\") pod \"community-operators-bpqbl\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:32 crc kubenswrapper[4929]: I1002 12:19:32.972172 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:33 crc kubenswrapper[4929]: I1002 12:19:33.442560 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpqbl"] Oct 02 12:19:33 crc kubenswrapper[4929]: I1002 12:19:33.975058 4929 generic.go:334] "Generic (PLEG): container finished" podID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerID="957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471" exitCode=0 Oct 02 12:19:33 crc kubenswrapper[4929]: I1002 12:19:33.975114 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpqbl" event={"ID":"a6e9d74b-0f37-404e-8090-2eba93a5a2ee","Type":"ContainerDied","Data":"957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471"} Oct 02 12:19:33 crc kubenswrapper[4929]: I1002 12:19:33.975334 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpqbl" event={"ID":"a6e9d74b-0f37-404e-8090-2eba93a5a2ee","Type":"ContainerStarted","Data":"402cbaee7e629afdec00a502f441f18c3dbb50fc5dec1e70c64eff877ae53e7b"} Oct 02 12:19:35 crc kubenswrapper[4929]: I1002 12:19:35.990561 4929 generic.go:334] "Generic (PLEG): container finished" podID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerID="7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34" exitCode=0 Oct 02 12:19:35 crc kubenswrapper[4929]: I1002 12:19:35.990655 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpqbl" event={"ID":"a6e9d74b-0f37-404e-8090-2eba93a5a2ee","Type":"ContainerDied","Data":"7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34"} Oct 02 12:19:37 crc kubenswrapper[4929]: I1002 12:19:37.011380 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpqbl" 
event={"ID":"a6e9d74b-0f37-404e-8090-2eba93a5a2ee","Type":"ContainerStarted","Data":"05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6"} Oct 02 12:19:37 crc kubenswrapper[4929]: I1002 12:19:37.041087 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bpqbl" podStartSLOduration=2.518699612 podStartE2EDuration="5.041054402s" podCreationTimestamp="2025-10-02 12:19:32 +0000 UTC" firstStartedPulling="2025-10-02 12:19:33.976794116 +0000 UTC m=+4174.527160480" lastFinishedPulling="2025-10-02 12:19:36.499148906 +0000 UTC m=+4177.049515270" observedRunningTime="2025-10-02 12:19:37.03664101 +0000 UTC m=+4177.587007374" watchObservedRunningTime="2025-10-02 12:19:37.041054402 +0000 UTC m=+4177.591420766" Oct 02 12:19:42 crc kubenswrapper[4929]: I1002 12:19:42.973136 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:42 crc kubenswrapper[4929]: I1002 12:19:42.973620 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:43 crc kubenswrapper[4929]: I1002 12:19:43.013525 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:43 crc kubenswrapper[4929]: I1002 12:19:43.095456 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:44 crc kubenswrapper[4929]: I1002 12:19:44.630204 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpqbl"] Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.069864 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bpqbl" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="registry-server" containerID="cri-o://05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6" gracePeriod=2 Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.467713 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.570067 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8b4k\" (UniqueName: \"kubernetes.io/projected/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-kube-api-access-z8b4k\") pod \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.570222 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-catalog-content\") pod \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.570305 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-utilities\") pod \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\" (UID: \"a6e9d74b-0f37-404e-8090-2eba93a5a2ee\") " Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.571329 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-utilities" (OuterVolumeSpecName: "utilities") pod "a6e9d74b-0f37-404e-8090-2eba93a5a2ee" (UID: "a6e9d74b-0f37-404e-8090-2eba93a5a2ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.571715 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.575664 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-kube-api-access-z8b4k" (OuterVolumeSpecName: "kube-api-access-z8b4k") pod "a6e9d74b-0f37-404e-8090-2eba93a5a2ee" (UID: "a6e9d74b-0f37-404e-8090-2eba93a5a2ee"). InnerVolumeSpecName "kube-api-access-z8b4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:19:45 crc kubenswrapper[4929]: I1002 12:19:45.673093 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8b4k\" (UniqueName: \"kubernetes.io/projected/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-kube-api-access-z8b4k\") on node \"crc\" DevicePath \"\"" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.078727 4929 generic.go:334] "Generic (PLEG): container finished" podID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerID="05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6" exitCode=0 Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.078774 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpqbl" event={"ID":"a6e9d74b-0f37-404e-8090-2eba93a5a2ee","Type":"ContainerDied","Data":"05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6"} Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.078803 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpqbl" event={"ID":"a6e9d74b-0f37-404e-8090-2eba93a5a2ee","Type":"ContainerDied","Data":"402cbaee7e629afdec00a502f441f18c3dbb50fc5dec1e70c64eff877ae53e7b"} Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.078820 4929 scope.go:117] "RemoveContainer" containerID="05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.079030 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpqbl" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.102652 4929 scope.go:117] "RemoveContainer" containerID="7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.112296 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6e9d74b-0f37-404e-8090-2eba93a5a2ee" (UID: "a6e9d74b-0f37-404e-8090-2eba93a5a2ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.126188 4929 scope.go:117] "RemoveContainer" containerID="957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.175242 4929 scope.go:117] "RemoveContainer" containerID="05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6" Oct 02 12:19:46 crc kubenswrapper[4929]: E1002 12:19:46.175781 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6\": container with ID starting with 05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6 not found: ID does not exist" containerID="05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.175848 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6"} err="failed to get container status \"05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6\": rpc error: code = NotFound desc = could not find container \"05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6\": container with ID starting with 05a2ff442f81e99fa5ce5a43bdd4ee6cef0f71dac3b75914b683605a0a6d4eb6 not found: ID does not exist" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.175889 4929 scope.go:117] "RemoveContainer" containerID="7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34" Oct 02 12:19:46 crc kubenswrapper[4929]: E1002 12:19:46.176572 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34\": container with ID starting with 7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34 not found: ID does not exist" containerID="7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.176608 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34"} err="failed to get container status \"7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34\": rpc error: code = NotFound desc = could not find container \"7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34\": container with ID starting with 7b3741d12162c97361b8d8f1b6f39b1f683b859ff8e36634d133ceee75b96c34 not found: ID does not exist" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.176630 4929 scope.go:117] "RemoveContainer" containerID="957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471" Oct 02 12:19:46 crc kubenswrapper[4929]: E1002 12:19:46.176917 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471\": container with ID starting with 957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471 not found: ID does not exist" containerID="957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.176941 4929 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471"} err="failed to get container status \"957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471\": rpc error: code = NotFound desc = could not find container \"957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471\": container with ID starting with 957d06b8532a536a61dd4cf30ca58e34259ad0b29e6d91d9f586d2d623439471 not found: ID does not exist" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.182289 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e9d74b-0f37-404e-8090-2eba93a5a2ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.403749 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpqbl"] Oct 02 12:19:46 crc kubenswrapper[4929]: I1002 12:19:46.409880 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bpqbl"] Oct 02 12:19:48 crc kubenswrapper[4929]: I1002 12:19:48.165440 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" path="/var/lib/kubelet/pods/a6e9d74b-0f37-404e-8090-2eba93a5a2ee/volumes" Oct 02 12:21:14 crc kubenswrapper[4929]: I1002 12:21:14.736600 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:21:14 crc kubenswrapper[4929]: I1002 12:21:14.737228 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:21:44 crc kubenswrapper[4929]: I1002 12:21:44.737287 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:21:44 crc kubenswrapper[4929]: I1002 12:21:44.737836 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:22:14 crc kubenswrapper[4929]: I1002 12:22:14.736682 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:22:14 crc kubenswrapper[4929]: I1002 12:22:14.737203 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 02 12:22:14 crc kubenswrapper[4929]: I1002 12:22:14.737245 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:22:14 crc kubenswrapper[4929]: I1002 12:22:14.737722 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25b857f254fa55aaeec7ea22ddf98e1b68381c6e0b2a9daebd266814371792ea"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:22:14 crc kubenswrapper[4929]: I1002 12:22:14.737780 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://25b857f254fa55aaeec7ea22ddf98e1b68381c6e0b2a9daebd266814371792ea" gracePeriod=600 Oct 02 12:22:15 crc kubenswrapper[4929]: I1002 12:22:15.228161 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="25b857f254fa55aaeec7ea22ddf98e1b68381c6e0b2a9daebd266814371792ea" exitCode=0 Oct 02 12:22:15 crc kubenswrapper[4929]: I1002 12:22:15.228341 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"25b857f254fa55aaeec7ea22ddf98e1b68381c6e0b2a9daebd266814371792ea"} Oct 02 12:22:15 crc kubenswrapper[4929]: I1002 12:22:15.228536 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8"} Oct 02 12:22:15 crc kubenswrapper[4929]: I1002 12:22:15.228561 4929 scope.go:117] "RemoveContainer" containerID="9fb412bcf4facbe9f3d025f3067ccbf43c45b76c38d39ee450f59d40bc306796" Oct 02 12:24:44 crc kubenswrapper[4929]: I1002 12:24:44.737464 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:24:44 crc kubenswrapper[4929]: I1002 12:24:44.738188 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:25:14 crc kubenswrapper[4929]: I1002 12:25:14.737217 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:25:14 crc kubenswrapper[4929]: I1002 12:25:14.737712 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:25:43 crc kubenswrapper[4929]: I1002 12:25:43.909482 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-2gmbm"] Oct 02 12:25:43 crc kubenswrapper[4929]: I1002 12:25:43.913913 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-2gmbm"] Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.026161 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-nscf6"] Oct 02 12:25:44 crc kubenswrapper[4929]: E1002 12:25:44.026444 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="extract-utilities" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.026457 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="extract-utilities" Oct 02 12:25:44 crc kubenswrapper[4929]: E1002 12:25:44.026486 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="registry-server" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.026492 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="registry-server" Oct 02 12:25:44 crc kubenswrapper[4929]: E1002 12:25:44.026516 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="extract-content" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.026522 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="extract-content" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.026643 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e9d74b-0f37-404e-8090-2eba93a5a2ee" containerName="registry-server" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.027121 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.029272 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.029436 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.030400 4929 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-2cfdb" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.034561 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.041783 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nscf6"] Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.166267 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="816c4698-4b96-4335-9661-ce6f6031fb6c" path="/var/lib/kubelet/pods/816c4698-4b96-4335-9661-ce6f6031fb6c/volumes" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.191535 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/287f056e-4c62-4c97-bcf4-9b839585c97f-crc-storage\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.192010 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/287f056e-4c62-4c97-bcf4-9b839585c97f-node-mnt\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.192154 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqw9h\" (UniqueName: \"kubernetes.io/projected/287f056e-4c62-4c97-bcf4-9b839585c97f-kube-api-access-qqw9h\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.293930 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/287f056e-4c62-4c97-bcf4-9b839585c97f-crc-storage\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.294391 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/287f056e-4c62-4c97-bcf4-9b839585c97f-node-mnt\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.294604 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqw9h\" (UniqueName: \"kubernetes.io/projected/287f056e-4c62-4c97-bcf4-9b839585c97f-kube-api-access-qqw9h\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.294764 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/287f056e-4c62-4c97-bcf4-9b839585c97f-node-mnt\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.294712 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/287f056e-4c62-4c97-bcf4-9b839585c97f-crc-storage\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.319472 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqw9h\" (UniqueName: \"kubernetes.io/projected/287f056e-4c62-4c97-bcf4-9b839585c97f-kube-api-access-qqw9h\") pod \"crc-storage-crc-nscf6\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.350001 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.737355 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.737615 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.737658 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.738266 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.738326 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" gracePeriod=600 Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.819137 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nscf6"] Oct 02 12:25:44 crc kubenswrapper[4929]: I1002 12:25:44.836525 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:25:44 crc kubenswrapper[4929]: E1002 12:25:44.877711 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:25:45 crc kubenswrapper[4929]: I1002 12:25:45.829240 4929 generic.go:334] "Generic (PLEG): container finished" podID="287f056e-4c62-4c97-bcf4-9b839585c97f" containerID="45f6eb7291b6e2ef890e46bacd6f28605f41eb92866064098030f2dca1a66f2b" exitCode=0 Oct 02 12:25:45 crc kubenswrapper[4929]: I1002 12:25:45.829323 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nscf6" event={"ID":"287f056e-4c62-4c97-bcf4-9b839585c97f","Type":"ContainerDied","Data":"45f6eb7291b6e2ef890e46bacd6f28605f41eb92866064098030f2dca1a66f2b"} Oct 02 12:25:45 crc kubenswrapper[4929]: I1002 12:25:45.829354 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nscf6" event={"ID":"287f056e-4c62-4c97-bcf4-9b839585c97f","Type":"ContainerStarted","Data":"dc2dbd8b89e54397bfa43b2ae205bb6fe53cb235abff1d2865018dbdffea292e"} Oct 02 12:25:45 crc kubenswrapper[4929]: I1002 12:25:45.832011 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" exitCode=0 Oct 02 12:25:45 crc kubenswrapper[4929]: I1002 12:25:45.832092 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8"} Oct 02 12:25:45 crc kubenswrapper[4929]: I1002 12:25:45.832145 4929 scope.go:117] "RemoveContainer" containerID="25b857f254fa55aaeec7ea22ddf98e1b68381c6e0b2a9daebd266814371792ea" Oct 02 12:25:45 crc kubenswrapper[4929]: I1002 12:25:45.833545 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:25:45 crc kubenswrapper[4929]: E1002 12:25:45.833879 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.107920 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.232706 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/287f056e-4c62-4c97-bcf4-9b839585c97f-node-mnt\") pod \"287f056e-4c62-4c97-bcf4-9b839585c97f\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.232846 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqw9h\" (UniqueName: \"kubernetes.io/projected/287f056e-4c62-4c97-bcf4-9b839585c97f-kube-api-access-qqw9h\") pod \"287f056e-4c62-4c97-bcf4-9b839585c97f\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.233068 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/287f056e-4c62-4c97-bcf4-9b839585c97f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "287f056e-4c62-4c97-bcf4-9b839585c97f" (UID: "287f056e-4c62-4c97-bcf4-9b839585c97f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.233241 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/287f056e-4c62-4c97-bcf4-9b839585c97f-crc-storage\") pod \"287f056e-4c62-4c97-bcf4-9b839585c97f\" (UID: \"287f056e-4c62-4c97-bcf4-9b839585c97f\") " Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.233505 4929 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/287f056e-4c62-4c97-bcf4-9b839585c97f-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.237409 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287f056e-4c62-4c97-bcf4-9b839585c97f-kube-api-access-qqw9h" (OuterVolumeSpecName: "kube-api-access-qqw9h") pod "287f056e-4c62-4c97-bcf4-9b839585c97f" (UID: "287f056e-4c62-4c97-bcf4-9b839585c97f"). InnerVolumeSpecName "kube-api-access-qqw9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.250705 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/287f056e-4c62-4c97-bcf4-9b839585c97f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "287f056e-4c62-4c97-bcf4-9b839585c97f" (UID: "287f056e-4c62-4c97-bcf4-9b839585c97f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.334476 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqw9h\" (UniqueName: \"kubernetes.io/projected/287f056e-4c62-4c97-bcf4-9b839585c97f-kube-api-access-qqw9h\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.334523 4929 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/287f056e-4c62-4c97-bcf4-9b839585c97f-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.865691 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nscf6" event={"ID":"287f056e-4c62-4c97-bcf4-9b839585c97f","Type":"ContainerDied","Data":"dc2dbd8b89e54397bfa43b2ae205bb6fe53cb235abff1d2865018dbdffea292e"} Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.865731 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc2dbd8b89e54397bfa43b2ae205bb6fe53cb235abff1d2865018dbdffea292e" Oct 02 12:25:47 crc kubenswrapper[4929]: I1002 12:25:47.865765 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nscf6" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.531561 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-nscf6"] Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.538440 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-nscf6"] Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.703408 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-5wqkt"] Oct 02 12:25:49 crc kubenswrapper[4929]: E1002 12:25:49.703756 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287f056e-4c62-4c97-bcf4-9b839585c97f" containerName="storage" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.703778 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="287f056e-4c62-4c97-bcf4-9b839585c97f" containerName="storage" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.703986 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="287f056e-4c62-4c97-bcf4-9b839585c97f" containerName="storage" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.704534 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.707193 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.707222 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.707981 4929 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-2cfdb" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.708166 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.713737 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5wqkt"] Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.865464 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-node-mnt\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.865877 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwkj8\" (UniqueName: \"kubernetes.io/projected/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-kube-api-access-zwkj8\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.865927 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-crc-storage\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.966515 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwkj8\" (UniqueName: \"kubernetes.io/projected/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-kube-api-access-zwkj8\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.966570 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-crc-storage\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.966608 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-node-mnt\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.966848 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-node-mnt\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " 
pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.967440 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-crc-storage\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:49 crc kubenswrapper[4929]: I1002 12:25:49.988703 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwkj8\" (UniqueName: \"kubernetes.io/projected/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-kube-api-access-zwkj8\") pod \"crc-storage-crc-5wqkt\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:50 crc kubenswrapper[4929]: I1002 12:25:50.020421 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:50 crc kubenswrapper[4929]: I1002 12:25:50.170238 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287f056e-4c62-4c97-bcf4-9b839585c97f" path="/var/lib/kubelet/pods/287f056e-4c62-4c97-bcf4-9b839585c97f/volumes" Oct 02 12:25:50 crc kubenswrapper[4929]: I1002 12:25:50.431321 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5wqkt"] Oct 02 12:25:50 crc kubenswrapper[4929]: I1002 12:25:50.889165 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5wqkt" event={"ID":"fcef8159-3e42-42e0-bef5-e96ea8ccec9c","Type":"ContainerStarted","Data":"8823e8f04f2413e47267f3a2476925c736bf1cdd8defeb7f1d3ee5efe97bb072"} Oct 02 12:25:51 crc kubenswrapper[4929]: I1002 12:25:51.897207 4929 generic.go:334] "Generic (PLEG): container finished" podID="fcef8159-3e42-42e0-bef5-e96ea8ccec9c" containerID="dc8a30f5dd7ff98f48917e8a7922127c6168bcaf62dce2f2979d54fabcc6a190" exitCode=0 Oct 02 12:25:51 crc kubenswrapper[4929]: I1002 12:25:51.897256 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5wqkt" event={"ID":"fcef8159-3e42-42e0-bef5-e96ea8ccec9c","Type":"ContainerDied","Data":"dc8a30f5dd7ff98f48917e8a7922127c6168bcaf62dce2f2979d54fabcc6a190"} Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.166985 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.307188 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-node-mnt\") pod \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.307248 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-crc-storage\") pod \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.307277 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwkj8\" (UniqueName: \"kubernetes.io/projected/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-kube-api-access-zwkj8\") pod \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\" (UID: \"fcef8159-3e42-42e0-bef5-e96ea8ccec9c\") " Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.307318 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fcef8159-3e42-42e0-bef5-e96ea8ccec9c" (UID: "fcef8159-3e42-42e0-bef5-e96ea8ccec9c"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.307489 4929 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.312939 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-kube-api-access-zwkj8" (OuterVolumeSpecName: "kube-api-access-zwkj8") pod "fcef8159-3e42-42e0-bef5-e96ea8ccec9c" (UID: "fcef8159-3e42-42e0-bef5-e96ea8ccec9c"). InnerVolumeSpecName "kube-api-access-zwkj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.326953 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fcef8159-3e42-42e0-bef5-e96ea8ccec9c" (UID: "fcef8159-3e42-42e0-bef5-e96ea8ccec9c"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.409049 4929 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.409096 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwkj8\" (UniqueName: \"kubernetes.io/projected/fcef8159-3e42-42e0-bef5-e96ea8ccec9c-kube-api-access-zwkj8\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.914901 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5wqkt" event={"ID":"fcef8159-3e42-42e0-bef5-e96ea8ccec9c","Type":"ContainerDied","Data":"8823e8f04f2413e47267f3a2476925c736bf1cdd8defeb7f1d3ee5efe97bb072"} Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.914943 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8823e8f04f2413e47267f3a2476925c736bf1cdd8defeb7f1d3ee5efe97bb072" Oct 02 12:25:53 crc kubenswrapper[4929]: I1002 12:25:53.915039 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5wqkt" Oct 02 12:25:57 crc kubenswrapper[4929]: I1002 12:25:57.156676 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:25:57 crc kubenswrapper[4929]: E1002 12:25:57.157196 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:26:11 crc kubenswrapper[4929]: I1002 12:26:11.157482 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:26:11 crc kubenswrapper[4929]: E1002 12:26:11.159323 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:26:15 crc kubenswrapper[4929]: I1002 12:26:15.225752 4929 scope.go:117] "RemoveContainer" containerID="f8851a997bc198fa21e046849aa89a3c55baaaad2903035da1f9eebc403707ec" Oct 02 12:26:22 crc kubenswrapper[4929]: I1002 12:26:22.156864 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:26:22 crc kubenswrapper[4929]: E1002 12:26:22.157567 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 
12:26:34.157041 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:26:34 crc kubenswrapper[4929]: E1002 12:26:34.157607 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.362325 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fxvtg"] Oct 02 12:26:34 crc kubenswrapper[4929]: E1002 12:26:34.362647 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcef8159-3e42-42e0-bef5-e96ea8ccec9c" containerName="storage" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.362660 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcef8159-3e42-42e0-bef5-e96ea8ccec9c" containerName="storage" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.362794 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcef8159-3e42-42e0-bef5-e96ea8ccec9c" containerName="storage" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.363786 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.372160 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxvtg"] Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.505556 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-catalog-content\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.505666 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-utilities\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.505695 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdds6\" (UniqueName: \"kubernetes.io/projected/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-kube-api-access-pdds6\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.606815 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-catalog-content\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.607181 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-utilities\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.607328 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdds6\" (UniqueName: \"kubernetes.io/projected/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-kube-api-access-pdds6\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.607628 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-utilities\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.608066 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-catalog-content\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.627447 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdds6\" (UniqueName: \"kubernetes.io/projected/6abdb15b-9bdf-4ba4-aaf0-899b30be9d78-kube-api-access-pdds6\") pod \"certified-operators-fxvtg\" (UID: \"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78\") " pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.689157 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:34 crc kubenswrapper[4929]: I1002 12:26:34.949543 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxvtg"] Oct 02 12:26:35 crc kubenswrapper[4929]: I1002 12:26:35.216451 4929 generic.go:334] "Generic (PLEG): container finished" podID="6abdb15b-9bdf-4ba4-aaf0-899b30be9d78" containerID="e0ca6cd7c0e66c154baef503f08a3b2f4b9f45f8090c08db4404b1fa6af2fb64" exitCode=0 Oct 02 12:26:35 crc kubenswrapper[4929]: I1002 12:26:35.216682 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxvtg" event={"ID":"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78","Type":"ContainerDied","Data":"e0ca6cd7c0e66c154baef503f08a3b2f4b9f45f8090c08db4404b1fa6af2fb64"} Oct 02 12:26:35 crc kubenswrapper[4929]: I1002 12:26:35.217571 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxvtg" event={"ID":"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78","Type":"ContainerStarted","Data":"67a8014caadc88b9a0c58d26df672d76be68091f063e7afd1852fdfb423ae7c0"} Oct 02 12:26:39 crc kubenswrapper[4929]: I1002 12:26:39.251010 4929 generic.go:334] "Generic (PLEG): container finished" podID="6abdb15b-9bdf-4ba4-aaf0-899b30be9d78" containerID="5dac2cb187cea91170e9d7719ae1673d90ac49d1a5dc9ef1a4738f42e9d14a51" exitCode=0 Oct 02 12:26:39 crc kubenswrapper[4929]: I1002 12:26:39.251204 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxvtg" event={"ID":"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78","Type":"ContainerDied","Data":"5dac2cb187cea91170e9d7719ae1673d90ac49d1a5dc9ef1a4738f42e9d14a51"} Oct 02 12:26:40 crc kubenswrapper[4929]: I1002 12:26:40.259817 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fxvtg" event={"ID":"6abdb15b-9bdf-4ba4-aaf0-899b30be9d78","Type":"ContainerStarted","Data":"e4b903fa0647fb687d47b9c314ab2e7a8e5905ecfc271b73c25860bd522af386"} Oct 02 12:26:40 crc kubenswrapper[4929]: I1002 12:26:40.285974 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fxvtg" podStartSLOduration=1.721591016 podStartE2EDuration="6.285921826s" podCreationTimestamp="2025-10-02 12:26:34 +0000 UTC" firstStartedPulling="2025-10-02 12:26:35.218584241 +0000 UTC m=+4595.768950605" lastFinishedPulling="2025-10-02 12:26:39.782915031 +0000 UTC m=+4600.333281415" observedRunningTime="2025-10-02 12:26:40.280640344 +0000 UTC m=+4600.831006708" watchObservedRunningTime="2025-10-02 12:26:40.285921826 +0000 UTC m=+4600.836288190" Oct 02 12:26:44 crc kubenswrapper[4929]: I1002 12:26:44.690408 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:44 crc kubenswrapper[4929]: I1002 12:26:44.690921 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:44 crc kubenswrapper[4929]: I1002 12:26:44.739465 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.156402 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:26:45 crc kubenswrapper[4929]: E1002 12:26:45.156847 4929 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.338723 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fxvtg" Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.413487 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fxvtg"] Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.440256 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cn5md"] Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.440507 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cn5md" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="registry-server" containerID="cri-o://1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2" gracePeriod=2 Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.910101 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.981292 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-utilities\") pod \"bb1e8738-5db8-4a58-961d-82f554c9f39b\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.981355 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lwq8\" (UniqueName: \"kubernetes.io/projected/bb1e8738-5db8-4a58-961d-82f554c9f39b-kube-api-access-9lwq8\") pod \"bb1e8738-5db8-4a58-961d-82f554c9f39b\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.981851 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-utilities" (OuterVolumeSpecName: "utilities") pod "bb1e8738-5db8-4a58-961d-82f554c9f39b" (UID: "bb1e8738-5db8-4a58-961d-82f554c9f39b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.982175 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-catalog-content\") pod \"bb1e8738-5db8-4a58-961d-82f554c9f39b\" (UID: \"bb1e8738-5db8-4a58-961d-82f554c9f39b\") " Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.982344 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:45 crc kubenswrapper[4929]: I1002 12:26:45.988307 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1e8738-5db8-4a58-961d-82f554c9f39b-kube-api-access-9lwq8" (OuterVolumeSpecName: "kube-api-access-9lwq8") pod "bb1e8738-5db8-4a58-961d-82f554c9f39b" (UID: "bb1e8738-5db8-4a58-961d-82f554c9f39b"). InnerVolumeSpecName "kube-api-access-9lwq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.035013 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb1e8738-5db8-4a58-961d-82f554c9f39b" (UID: "bb1e8738-5db8-4a58-961d-82f554c9f39b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.083163 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lwq8\" (UniqueName: \"kubernetes.io/projected/bb1e8738-5db8-4a58-961d-82f554c9f39b-kube-api-access-9lwq8\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.083198 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1e8738-5db8-4a58-961d-82f554c9f39b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.302066 4929 generic.go:334] "Generic (PLEG): container finished" podID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerID="1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2" exitCode=0 Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.302128 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cn5md" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.302152 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn5md" event={"ID":"bb1e8738-5db8-4a58-961d-82f554c9f39b","Type":"ContainerDied","Data":"1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2"} Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.302203 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cn5md" event={"ID":"bb1e8738-5db8-4a58-961d-82f554c9f39b","Type":"ContainerDied","Data":"07f45787636421552b86c7d1e0f367bcab23c9bb1b609295a717beddb3a20d7a"} Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.302228 4929 scope.go:117] "RemoveContainer" containerID="1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.327781 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cn5md"] Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.331138 4929 scope.go:117] "RemoveContainer" containerID="0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.339601 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cn5md"] Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.351550 4929 scope.go:117] "RemoveContainer" containerID="acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.382770 4929 scope.go:117] "RemoveContainer" containerID="1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2" Oct 02 12:26:46 crc kubenswrapper[4929]: E1002 12:26:46.383472 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2\": container with ID starting with 1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2 not found: ID does not exist" containerID="1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.383529 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2"} err="failed to get container status \"1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2\": rpc error: code = NotFound desc = could not find container \"1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2\": container with ID starting with 1b2e08158bfa4abd74541ed036cdc503adcbab38247440b9f85db42c11ac68e2 not found: ID does not exist" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.383557 4929 scope.go:117] "RemoveContainer" containerID="0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0" Oct 02 12:26:46 crc kubenswrapper[4929]: E1002 12:26:46.383839 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0\": container with ID starting with 0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0 not found: ID does not exist" containerID="0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.383863 4929 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0"} err="failed to get container status \"0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0\": rpc error: code = NotFound desc = could not find container \"0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0\": container with ID starting with 0c047f8735418c5f14f9bec7581c6547c47a32292e0559b688d75a2fd79be0e0 not found: ID does not exist" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.383877 4929 scope.go:117] "RemoveContainer" containerID="acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c" Oct 02 12:26:46 crc kubenswrapper[4929]: E1002 12:26:46.384164 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c\": container with ID starting with acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c not found: ID does not exist" containerID="acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c" Oct 02 12:26:46 crc kubenswrapper[4929]: I1002 12:26:46.384188 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c"} err="failed to get container status \"acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c\": rpc error: code = NotFound desc = could not find container \"acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c\": container with ID starting with acca81f3ae70e02f07f2e66d65684f65cbbd608143df39ab087eb463d28f054c not found: ID does not exist" Oct 02 12:26:48 crc kubenswrapper[4929]: I1002 12:26:48.165688 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" path="/var/lib/kubelet/pods/bb1e8738-5db8-4a58-961d-82f554c9f39b/volumes" Oct 02 12:26:56 crc kubenswrapper[4929]: I1002 12:26:56.158008 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:26:56 crc kubenswrapper[4929]: E1002 12:26:56.158807 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:27:07 crc kubenswrapper[4929]: I1002 12:27:07.156570 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:27:07 crc kubenswrapper[4929]: E1002 12:27:07.157167 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:27:19 crc kubenswrapper[4929]: I1002 12:27:19.157400 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:27:19 crc 
kubenswrapper[4929]: E1002 12:27:19.158638 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:27:30 crc kubenswrapper[4929]: I1002 12:27:30.159841 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:27:30 crc kubenswrapper[4929]: E1002 12:27:30.160581 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:27:42 crc kubenswrapper[4929]: I1002 12:27:42.156675 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:27:42 crc kubenswrapper[4929]: E1002 12:27:42.157948 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:27:56 crc kubenswrapper[4929]: I1002 12:27:56.157113 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:27:56 crc kubenswrapper[4929]: E1002 12:27:56.157769 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:28:07 crc kubenswrapper[4929]: I1002 12:28:07.156216 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:28:07 crc kubenswrapper[4929]: E1002 12:28:07.156894 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:28:22 crc kubenswrapper[4929]: I1002 12:28:22.156594 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:28:22 crc kubenswrapper[4929]: E1002 12:28:22.157484 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:28:33 crc kubenswrapper[4929]: I1002 12:28:33.156565 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:28:33 crc kubenswrapper[4929]: E1002 12:28:33.157312 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:28:44 crc kubenswrapper[4929]: I1002 12:28:44.156670 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:28:44 crc kubenswrapper[4929]: E1002 12:28:44.157329 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.790230 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-q625q"] Oct 02 12:28:54 crc kubenswrapper[4929]: E1002 12:28:54.791086 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="extract-utilities" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.791099 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="extract-utilities" Oct 02 12:28:54 crc kubenswrapper[4929]: E1002 12:28:54.791110 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="extract-content" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.791184 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="extract-content" Oct 02 12:28:54 crc kubenswrapper[4929]: E1002 12:28:54.791193 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="registry-server" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.791223 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="registry-server" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.791438 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1e8738-5db8-4a58-961d-82f554c9f39b" containerName="registry-server" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.792541 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.795162 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-gbjk9" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.800889 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.800938 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.801097 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.801329 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.805269 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-q625q"] Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.840175 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcppv\" (UniqueName: \"kubernetes.io/projected/2f61a918-28e7-4df4-b84c-19728acf4ef5-kube-api-access-qcppv\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.840232 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-config\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.840277 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.941489 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcppv\" (UniqueName: \"kubernetes.io/projected/2f61a918-28e7-4df4-b84c-19728acf4ef5-kube-api-access-qcppv\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.941560 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-config\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.941606 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.942716 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.942780 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-config\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:54 crc kubenswrapper[4929]: I1002 12:28:54.966986 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcppv\" (UniqueName: \"kubernetes.io/projected/2f61a918-28e7-4df4-b84c-19728acf4ef5-kube-api-access-qcppv\") pod \"dnsmasq-dns-5d7b5456f5-q625q\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") " pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.049924 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x2dhw"] Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.051075 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.063878 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x2dhw"] Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.115528 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.147015 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-config\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.147471 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjllq\" (UniqueName: \"kubernetes.io/projected/312f7cef-f108-4b04-a009-32e0303c16b9-kube-api-access-hjllq\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.147528 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.249178 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-config\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.249340 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjllq\" (UniqueName: \"kubernetes.io/projected/312f7cef-f108-4b04-a009-32e0303c16b9-kube-api-access-hjllq\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: 
\"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.249380 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.250842 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-config\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.252557 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.283949 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjllq\" (UniqueName: \"kubernetes.io/projected/312f7cef-f108-4b04-a009-32e0303c16b9-kube-api-access-hjllq\") pod \"dnsmasq-dns-98ddfc8f-x2dhw\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.372059 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.620849 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-q625q"] Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.836000 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x2dhw"] Oct 02 12:28:55 crc kubenswrapper[4929]: W1002 12:28:55.871987 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod312f7cef_f108_4b04_a009_32e0303c16b9.slice/crio-c66ef47a030efd785af06744dbf1e482759a78d525411c1968eccb9f1d6eac72 WatchSource:0}: Error finding container c66ef47a030efd785af06744dbf1e482759a78d525411c1968eccb9f1d6eac72: Status 404 returned error can't find the container with id c66ef47a030efd785af06744dbf1e482759a78d525411c1968eccb9f1d6eac72 Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.938354 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.939828 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.942513 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.942548 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.942614 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.942637 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ttfj8" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.942682 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.956910 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.958125 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.958258 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.958389 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.958482 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5359c6db-3207-412b-a6b6-2fb792dacd55-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.958614 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.958720 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.958869 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5359c6db-3207-412b-a6b6-2fb792dacd55-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.959022 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6559\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-kube-api-access-f6559\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:55 crc kubenswrapper[4929]: I1002 12:28:55.957054 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.059984 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060020 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060053 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5359c6db-3207-412b-a6b6-2fb792dacd55-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060079 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6559\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-kube-api-access-f6559\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060101 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060136 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060161 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060180 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060197 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5359c6db-3207-412b-a6b6-2fb792dacd55-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.060878 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.061210 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.061547 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.062707 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0" Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.062891 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.062921 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1744e4e3ac54f5e4179690cf1074bae9cd765942657cd19f77f4eea8bc18c758/globalmount\"" pod="openstack/rabbitmq-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.064970 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5359c6db-3207-412b-a6b6-2fb792dacd55-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.065025 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5359c6db-3207-412b-a6b6-2fb792dacd55-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.072362 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.076688 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6559\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-kube-api-access-f6559\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.098601 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " pod="openstack/rabbitmq-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.227069 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.228179 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.230359 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.230376 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.230731 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.231230 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.232054 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4fgqj"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.245770 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.270020 4929 generic.go:334] "Generic (PLEG): container finished" podID="312f7cef-f108-4b04-a009-32e0303c16b9" containerID="994f6ae895d895365305ce2b2517f3c8079d463c9a46d37e48fbb292997351b6" exitCode=0
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.270138 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" event={"ID":"312f7cef-f108-4b04-a009-32e0303c16b9","Type":"ContainerDied","Data":"994f6ae895d895365305ce2b2517f3c8079d463c9a46d37e48fbb292997351b6"}
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.270172 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" event={"ID":"312f7cef-f108-4b04-a009-32e0303c16b9","Type":"ContainerStarted","Data":"c66ef47a030efd785af06744dbf1e482759a78d525411c1968eccb9f1d6eac72"}
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.273101 4929 generic.go:334] "Generic (PLEG): container finished" podID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerID="6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471" exitCode=0
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.273152 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" event={"ID":"2f61a918-28e7-4df4-b84c-19728acf4ef5","Type":"ContainerDied","Data":"6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471"}
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.273181 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" event={"ID":"2f61a918-28e7-4df4-b84c-19728acf4ef5","Type":"ContainerStarted","Data":"9927b691af9d8e60c9f7f8ae00e48dbe9a2f61ebc3be84172987c99756bcf417"}
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.322687 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.366921 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367033 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367082 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367146 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367176 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpzz9\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-kube-api-access-hpzz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367220 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367282 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3659a98f-4d3d-4d11-accb-82cb65ebff64-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367317 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.367349 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3659a98f-4d3d-4d11-accb-82cb65ebff64-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.468945 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469311 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469367 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469388 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzz9\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-kube-api-access-hpzz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469420 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469470 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3659a98f-4d3d-4d11-accb-82cb65ebff64-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469486 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469509 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3659a98f-4d3d-4d11-accb-82cb65ebff64-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469565 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.469811 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.470573 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.470645 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.470700 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.471911 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.471941 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/29c6a0ce90f8e2fd6fc75870c9f9a54b23e28d5e81065606be22fd560126963f/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.472388 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3659a98f-4d3d-4d11-accb-82cb65ebff64-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.472502 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.473685 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3659a98f-4d3d-4d11-accb-82cb65ebff64-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.491739 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpzz9\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-kube-api-access-hpzz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.507631 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.571022 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.763322 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 02 12:28:56 crc kubenswrapper[4929]: W1002 12:28:56.777045 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5359c6db_3207_412b_a6b6_2fb792dacd55.slice/crio-e1fa4848f1c07cdfb5ecb805e1bd3ebbc5e14e4462b5a757ec2772911a12d922 WatchSource:0}: Error finding container e1fa4848f1c07cdfb5ecb805e1bd3ebbc5e14e4462b5a757ec2772911a12d922: Status 404 returned error can't find the container with id e1fa4848f1c07cdfb5ecb805e1bd3ebbc5e14e4462b5a757ec2772911a12d922
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.862015 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.862944 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.869289 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.869472 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vnhql"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.874774 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.980408 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce6475ed-02e6-4207-aa71-b887b2c53b8d-kolla-config\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.980571 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlfqw\" (UniqueName: \"kubernetes.io/projected/ce6475ed-02e6-4207-aa71-b887b2c53b8d-kube-api-access-xlfqw\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:56 crc kubenswrapper[4929]: I1002 12:28:56.980647 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce6475ed-02e6-4207-aa71-b887b2c53b8d-config-data\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.068523 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 02 12:28:57 crc kubenswrapper[4929]: W1002 12:28:57.077281 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3659a98f_4d3d_4d11_accb_82cb65ebff64.slice/crio-17b6165508856bd2c1845a646f6baaebba78185ca4ef64053bde9e6574b73275 WatchSource:0}: Error finding container 17b6165508856bd2c1845a646f6baaebba78185ca4ef64053bde9e6574b73275: Status 404 returned error can't find the container with id 17b6165508856bd2c1845a646f6baaebba78185ca4ef64053bde9e6574b73275
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.081654 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce6475ed-02e6-4207-aa71-b887b2c53b8d-config-data\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.081946 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce6475ed-02e6-4207-aa71-b887b2c53b8d-kolla-config\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.082040 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlfqw\" (UniqueName: \"kubernetes.io/projected/ce6475ed-02e6-4207-aa71-b887b2c53b8d-kube-api-access-xlfqw\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.083120 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce6475ed-02e6-4207-aa71-b887b2c53b8d-config-data\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.083712 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce6475ed-02e6-4207-aa71-b887b2c53b8d-kolla-config\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.117617 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlfqw\" (UniqueName: \"kubernetes.io/projected/ce6475ed-02e6-4207-aa71-b887b2c53b8d-kube-api-access-xlfqw\") pod \"memcached-0\" (UID: \"ce6475ed-02e6-4207-aa71-b887b2c53b8d\") " pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.204542 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.290269 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3659a98f-4d3d-4d11-accb-82cb65ebff64","Type":"ContainerStarted","Data":"17b6165508856bd2c1845a646f6baaebba78185ca4ef64053bde9e6574b73275"}
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.302549 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" event={"ID":"312f7cef-f108-4b04-a009-32e0303c16b9","Type":"ContainerStarted","Data":"fb2f9850195c10ad70a0a03b738f0ece3394c3b293b696b05ab2a75f25c3c9da"}
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.302745 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.304163 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5359c6db-3207-412b-a6b6-2fb792dacd55","Type":"ContainerStarted","Data":"e1fa4848f1c07cdfb5ecb805e1bd3ebbc5e14e4462b5a757ec2772911a12d922"}
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.308442 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" event={"ID":"2f61a918-28e7-4df4-b84c-19728acf4ef5","Type":"ContainerStarted","Data":"0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd"}
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.308618 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.324950 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" podStartSLOduration=2.324931514 podStartE2EDuration="2.324931514s" podCreationTimestamp="2025-10-02 12:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:57.321064162 +0000 UTC m=+4737.871430536" watchObservedRunningTime="2025-10-02 12:28:57.324931514 +0000 UTC m=+4737.875297878"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.346125 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" podStartSLOduration=3.346105394 podStartE2EDuration="3.346105394s" podCreationTimestamp="2025-10-02 12:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:57.338703561 +0000 UTC m=+4737.889069925" watchObservedRunningTime="2025-10-02 12:28:57.346105394 +0000 UTC m=+4737.896471768"
Oct 02 12:28:57 crc kubenswrapper[4929]: I1002 12:28:57.628887 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.156798 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8"
Oct 02 12:28:58 crc kubenswrapper[4929]: E1002 12:28:58.157383 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.212403 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.214623 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.218399 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.219815 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.220176 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.220487 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.220819 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-g2q9q"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.225810 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.234988 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.267845 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.269394 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.271701 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qtdch"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.271701 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.272192 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.274677 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.284863 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.304927 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdk4w\" (UniqueName: \"kubernetes.io/projected/f4a722cb-85e5-4bcd-8acd-101708d08d0e-kube-api-access-zdk4w\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305069 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305112 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-config-data-default\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305133 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f4a722cb-85e5-4bcd-8acd-101708d08d0e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305150 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-secrets\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305234 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305451 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-kolla-config\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305475 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.305518 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.324915 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ce6475ed-02e6-4207-aa71-b887b2c53b8d","Type":"ContainerStarted","Data":"dab3343b03bdea1bb3da84b886041ba78c5414073d188fab64fcb69a928d6dc2"}
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.324974 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ce6475ed-02e6-4207-aa71-b887b2c53b8d","Type":"ContainerStarted","Data":"02fe67cff52075e0a1599a5e2ebeff1529bb9afed6f023a8f85b9aaebf2ca2fa"}
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.325114 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.326575 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5359c6db-3207-412b-a6b6-2fb792dacd55","Type":"ContainerStarted","Data":"cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650"}
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.343760 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.343742542 podStartE2EDuration="2.343742542s" podCreationTimestamp="2025-10-02 12:28:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:58.338187052 +0000 UTC m=+4738.888553406" watchObservedRunningTime="2025-10-02 12:28:58.343742542 +0000 UTC m=+4738.894108906"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.407475 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.407524 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.407557 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk86b\" (UniqueName: \"kubernetes.io/projected/21883b8f-1b4a-4eb8-9f9f-48047745f86f-kube-api-access-wk86b\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.407618 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.407652 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21883b8f-1b4a-4eb8-9f9f-48047745f86f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.407717 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdk4w\" (UniqueName: \"kubernetes.io/projected/f4a722cb-85e5-4bcd-8acd-101708d08d0e-kube-api-access-zdk4w\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408067 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408145 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-config-data-default\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408200 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f4a722cb-85e5-4bcd-8acd-101708d08d0e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408228 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-secrets\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408251 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408300 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408431 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-kolla-config\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408462 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408497 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408524 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408600 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408697 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.408906 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f4a722cb-85e5-4bcd-8acd-101708d08d0e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.409319 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-config-data-default\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.409564 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-kolla-config\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.409751 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4a722cb-85e5-4bcd-8acd-101708d08d0e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.412336 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.412375 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aa8cf1d8754882b8be83ab360dcc99cf72ba7ab3fc502dd0d4b6337fa7a1d0b0/globalmount\"" pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.412800 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-secrets\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.413185 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.424538 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a722cb-85e5-4bcd-8acd-101708d08d0e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.427756 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdk4w\" (UniqueName: \"kubernetes.io/projected/f4a722cb-85e5-4bcd-8acd-101708d08d0e-kube-api-access-zdk4w\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.438188 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1eefdc90-44b2-4a60-8b3a-006538f46304\") pod \"openstack-galera-0\" (UID: \"f4a722cb-85e5-4bcd-8acd-101708d08d0e\") " pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510383 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21883b8f-1b4a-4eb8-9f9f-48047745f86f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510464 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510509 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510529 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510546 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510584 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510602 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510623 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk86b\" (UniqueName: \"kubernetes.io/projected/21883b8f-1b4a-4eb8-9f9f-48047745f86f-kube-api-access-wk86b\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510641 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.510933 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/21883b8f-1b4a-4eb8-9f9f-48047745f86f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.511532 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.512047 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.512420 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21883b8f-1b4a-4eb8-9f9f-48047745f86f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.514544 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.515598 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.515777 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.515896 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21883b8f-1b4a-4eb8-9f9f-48047745f86f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.515803 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3662ad4d66413efce8916b5f10115ab40cb648e8fc1c840f8b5392769e5ac7f5/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.528500 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk86b\" (UniqueName: \"kubernetes.io/projected/21883b8f-1b4a-4eb8-9f9f-48047745f86f-kube-api-access-wk86b\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.542878 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.554796 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-23aa75d7-90a0-4c30-8a8f-49f63b11ced5\") pod \"openstack-cell1-galera-0\" (UID: \"21883b8f-1b4a-4eb8-9f9f-48047745f86f\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.584848 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 02 12:28:58 crc kubenswrapper[4929]: I1002 12:28:58.988910 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 12:28:58 crc kubenswrapper[4929]: W1002 12:28:58.993457 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4a722cb_85e5_4bcd_8acd_101708d08d0e.slice/crio-2f0fe977f218bacfd0e2122f38ff102bc0125a479e2815060013943195b52a53 WatchSource:0}: Error finding container 2f0fe977f218bacfd0e2122f38ff102bc0125a479e2815060013943195b52a53: Status 404 returned error can't find the container with id 2f0fe977f218bacfd0e2122f38ff102bc0125a479e2815060013943195b52a53
Oct 02 12:28:59 crc kubenswrapper[4929]: I1002 12:28:59.050181 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 12:28:59 crc kubenswrapper[4929]: W1002 12:28:59.053869 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21883b8f_1b4a_4eb8_9f9f_48047745f86f.slice/crio-545f80a15a2362a63d049336ec4f139e0576ea9c07e853eeb099950d3989cbb6 WatchSource:0}: Error finding container 545f80a15a2362a63d049336ec4f139e0576ea9c07e853eeb099950d3989cbb6: Status 404 returned error can't find the container with id 545f80a15a2362a63d049336ec4f139e0576ea9c07e853eeb099950d3989cbb6
Oct 02 12:28:59 crc kubenswrapper[4929]: I1002 12:28:59.336284 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f4a722cb-85e5-4bcd-8acd-101708d08d0e","Type":"ContainerStarted","Data":"1e607270095ebf8fd48228cddc9b9866ea3f6b075ee4b2e8e7b45e9f6588ef75"}
Oct 02 12:28:59 crc kubenswrapper[4929]: I1002 12:28:59.336355 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f4a722cb-85e5-4bcd-8acd-101708d08d0e","Type":"ContainerStarted","Data":"2f0fe977f218bacfd0e2122f38ff102bc0125a479e2815060013943195b52a53"}
Oct 02 12:28:59 crc kubenswrapper[4929]: I1002 12:28:59.339612 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3659a98f-4d3d-4d11-accb-82cb65ebff64","Type":"ContainerStarted","Data":"6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8"}
Oct 02 12:28:59 crc kubenswrapper[4929]: I1002 12:28:59.341292 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"21883b8f-1b4a-4eb8-9f9f-48047745f86f","Type":"ContainerStarted","Data":"b6a71e43dd749a5325706315707a1353172972faaaf784004d7f55eee4003f4b"}
Oct 02 12:28:59 crc kubenswrapper[4929]: I1002 12:28:59.341325 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"21883b8f-1b4a-4eb8-9f9f-48047745f86f","Type":"ContainerStarted","Data":"545f80a15a2362a63d049336ec4f139e0576ea9c07e853eeb099950d3989cbb6"}
Oct 02 12:29:02 crc kubenswrapper[4929]: I1002 12:29:02.206140 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Oct 02 12:29:03 crc kubenswrapper[4929]: I1002 12:29:03.382827 4929 generic.go:334] "Generic (PLEG): container finished" podID="21883b8f-1b4a-4eb8-9f9f-48047745f86f" containerID="b6a71e43dd749a5325706315707a1353172972faaaf784004d7f55eee4003f4b" exitCode=0
Oct 02 12:29:03 crc kubenswrapper[4929]: I1002 12:29:03.382908 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"21883b8f-1b4a-4eb8-9f9f-48047745f86f","Type":"ContainerDied","Data":"b6a71e43dd749a5325706315707a1353172972faaaf784004d7f55eee4003f4b"}
Oct 02 12:29:03 crc kubenswrapper[4929]: I1002 12:29:03.386882 4929 generic.go:334] "Generic (PLEG): container finished" podID="f4a722cb-85e5-4bcd-8acd-101708d08d0e" containerID="1e607270095ebf8fd48228cddc9b9866ea3f6b075ee4b2e8e7b45e9f6588ef75" exitCode=0
Oct 02 12:29:03 crc kubenswrapper[4929]: I1002 12:29:03.386921 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f4a722cb-85e5-4bcd-8acd-101708d08d0e","Type":"ContainerDied","Data":"1e607270095ebf8fd48228cddc9b9866ea3f6b075ee4b2e8e7b45e9f6588ef75"}
Oct 02 12:29:04 crc kubenswrapper[4929]: I1002 12:29:04.394813 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"21883b8f-1b4a-4eb8-9f9f-48047745f86f","Type":"ContainerStarted","Data":"c75c6cf96ac7d32e0ea3e77762fdc62b43073caafdc59cdacce4ff9621f5a8f1"}
Oct 02 12:29:04 crc kubenswrapper[4929]: I1002 12:29:04.397263 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f4a722cb-85e5-4bcd-8acd-101708d08d0e","Type":"ContainerStarted","Data":"08e9ea5fa85e10d7334d2396acc6d961729aae54ae79056d0362a84fde56b6cb"}
Oct 02 12:29:04 crc kubenswrapper[4929]: I1002 12:29:04.416576 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.416556913 podStartE2EDuration="7.416556913s" podCreationTimestamp="2025-10-02 12:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:04.413259248 +0000 UTC m=+4744.963625672" watchObservedRunningTime="2025-10-02 12:29:04.416556913 +0000 UTC m=+4744.966923277"
Oct 02 12:29:04 crc kubenswrapper[4929]: I1002 12:29:04.435319 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.435293313 podStartE2EDuration="7.435293313s" podCreationTimestamp="2025-10-02 12:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:04.431093382 +0000 UTC m=+4744.981459756" watchObservedRunningTime="2025-10-02 12:29:04.435293313 +0000 UTC m=+4744.985659687"
Oct 02 12:29:05 crc kubenswrapper[4929]: I1002 12:29:05.117705 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q"
Oct 02 12:29:05 crc kubenswrapper[4929]: I1002 12:29:05.374359 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw"
Oct 02 12:29:05 crc kubenswrapper[4929]: I1002 12:29:05.438158 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-q625q"]
Oct 02 12:29:05 crc kubenswrapper[4929]: I1002 12:29:05.438455 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" podUID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerName="dnsmasq-dns" containerID="cri-o://0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd" gracePeriod=10
Oct 02 12:29:05 crc kubenswrapper[4929]: I1002 12:29:05.929250 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q"
Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.032684 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcppv\" (UniqueName: \"kubernetes.io/projected/2f61a918-28e7-4df4-b84c-19728acf4ef5-kube-api-access-qcppv\") pod \"2f61a918-28e7-4df4-b84c-19728acf4ef5\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") "
Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.032747 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-dns-svc\") pod \"2f61a918-28e7-4df4-b84c-19728acf4ef5\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") "
Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.032868 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-config\") pod \"2f61a918-28e7-4df4-b84c-19728acf4ef5\" (UID: \"2f61a918-28e7-4df4-b84c-19728acf4ef5\") "
Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.038269 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f61a918-28e7-4df4-b84c-19728acf4ef5-kube-api-access-qcppv" (OuterVolumeSpecName: "kube-api-access-qcppv") pod "2f61a918-28e7-4df4-b84c-19728acf4ef5" (UID: "2f61a918-28e7-4df4-b84c-19728acf4ef5"). InnerVolumeSpecName "kube-api-access-qcppv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.066288 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f61a918-28e7-4df4-b84c-19728acf4ef5" (UID: "2f61a918-28e7-4df4-b84c-19728acf4ef5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.066715 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-config" (OuterVolumeSpecName: "config") pod "2f61a918-28e7-4df4-b84c-19728acf4ef5" (UID: "2f61a918-28e7-4df4-b84c-19728acf4ef5"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.134630 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.134670 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcppv\" (UniqueName: \"kubernetes.io/projected/2f61a918-28e7-4df4-b84c-19728acf4ef5-kube-api-access-qcppv\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.134681 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f61a918-28e7-4df4-b84c-19728acf4ef5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.420408 4929 generic.go:334] "Generic (PLEG): container finished" podID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerID="0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd" exitCode=0 Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.420449 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" event={"ID":"2f61a918-28e7-4df4-b84c-19728acf4ef5","Type":"ContainerDied","Data":"0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd"} Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.420475 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" event={"ID":"2f61a918-28e7-4df4-b84c-19728acf4ef5","Type":"ContainerDied","Data":"9927b691af9d8e60c9f7f8ae00e48dbe9a2f61ebc3be84172987c99756bcf417"} Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.420483 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-q625q" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.420492 4929 scope.go:117] "RemoveContainer" containerID="0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.438520 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-q625q"] Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.444303 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-q625q"] Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.449391 4929 scope.go:117] "RemoveContainer" containerID="6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.470207 4929 scope.go:117] "RemoveContainer" containerID="0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd" Oct 02 12:29:06 crc kubenswrapper[4929]: E1002 12:29:06.470748 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd\": container with ID starting with 0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd not found: ID does not exist" containerID="0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.470793 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd"} err="failed to get container status \"0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd\": rpc error: code = NotFound desc = could not find container \"0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd\": container with ID starting with 0e92f0294de9e2d66e1af015eeb2d5b866ab798380c47b7525341325473012cd not found: ID does not exist" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.470823 4929 scope.go:117] "RemoveContainer" containerID="6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471" Oct 02 12:29:06 crc kubenswrapper[4929]: E1002 12:29:06.471286 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471\": container with ID starting with 6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471 not found: ID does not exist" containerID="6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471" Oct 02 12:29:06 crc kubenswrapper[4929]: I1002 12:29:06.471309 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471"} err="failed to get container status \"6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471\": rpc error: code = NotFound desc = could not find container \"6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471\": container with ID starting with 6a9607271cb8163c7b88fb0914a94d491de96621f25a3463bcc7cb105572f471 not found: ID does not exist" Oct 02 12:29:08 crc kubenswrapper[4929]: I1002 12:29:08.167575 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f61a918-28e7-4df4-b84c-19728acf4ef5" path="/var/lib/kubelet/pods/2f61a918-28e7-4df4-b84c-19728acf4ef5/volumes" Oct 02 12:29:08 crc kubenswrapper[4929]: I1002 12:29:08.543703 4929 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 12:29:08 crc kubenswrapper[4929]: I1002 12:29:08.543769 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 12:29:08 crc kubenswrapper[4929]: I1002 12:29:08.585241 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 12:29:08 crc kubenswrapper[4929]: I1002 12:29:08.585315 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 12:29:08 crc kubenswrapper[4929]: I1002 12:29:08.640507 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 12:29:09 crc kubenswrapper[4929]: E1002 12:29:09.250474 4929 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.173:45500->38.102.83.173:39349: write tcp 38.102.83.173:45500->38.102.83.173:39349: write: broken pipe Oct 02 12:29:09 crc kubenswrapper[4929]: I1002 12:29:09.494713 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 12:29:10 crc kubenswrapper[4929]: I1002 12:29:10.598638 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 02 12:29:10 crc kubenswrapper[4929]: I1002 12:29:10.647448 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 02 12:29:13 crc kubenswrapper[4929]: I1002 12:29:13.156795 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:29:13 crc kubenswrapper[4929]: E1002 12:29:13.157050 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:29:28 crc kubenswrapper[4929]: I1002 12:29:28.157102 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:29:28 crc kubenswrapper[4929]: E1002 12:29:28.158593 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:29:30 crc kubenswrapper[4929]: I1002 12:29:30.607800 4929 generic.go:334] "Generic (PLEG): container finished" podID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerID="cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650" exitCode=0 Oct 02 12:29:30 crc kubenswrapper[4929]: I1002 12:29:30.607850 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5359c6db-3207-412b-a6b6-2fb792dacd55","Type":"ContainerDied","Data":"cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650"} Oct 02 12:29:30 crc kubenswrapper[4929]: I1002 12:29:30.612329 4929 
generic.go:334] "Generic (PLEG): container finished" podID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerID="6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8" exitCode=0 Oct 02 12:29:30 crc kubenswrapper[4929]: I1002 12:29:30.612364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3659a98f-4d3d-4d11-accb-82cb65ebff64","Type":"ContainerDied","Data":"6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8"} Oct 02 12:29:31 crc kubenswrapper[4929]: I1002 12:29:31.622644 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5359c6db-3207-412b-a6b6-2fb792dacd55","Type":"ContainerStarted","Data":"bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb"} Oct 02 12:29:31 crc kubenswrapper[4929]: I1002 12:29:31.623167 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 12:29:31 crc kubenswrapper[4929]: I1002 12:29:31.627587 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3659a98f-4d3d-4d11-accb-82cb65ebff64","Type":"ContainerStarted","Data":"c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845"} Oct 02 12:29:31 crc kubenswrapper[4929]: I1002 12:29:31.627848 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:29:31 crc kubenswrapper[4929]: I1002 12:29:31.673761 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.673743624 podStartE2EDuration="37.673743624s" podCreationTimestamp="2025-10-02 12:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:31.669609345 +0000 UTC m=+4772.219975719" watchObservedRunningTime="2025-10-02 12:29:31.673743624 +0000 UTC m=+4772.224109988" Oct 02 12:29:31 crc kubenswrapper[4929]: I1002 12:29:31.692904 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.692885786 podStartE2EDuration="36.692885786s" podCreationTimestamp="2025-10-02 12:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:31.686702008 +0000 UTC m=+4772.237068382" watchObservedRunningTime="2025-10-02 12:29:31.692885786 +0000 UTC m=+4772.243252150" Oct 02 12:29:41 crc kubenswrapper[4929]: I1002 12:29:41.157208 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:29:41 crc kubenswrapper[4929]: E1002 12:29:41.157893 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:29:42 crc kubenswrapper[4929]: I1002 12:29:42.887413 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-47jwl"] Oct 02 12:29:42 crc kubenswrapper[4929]: E1002 12:29:42.887805 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerName="init" Oct 02 12:29:42 crc kubenswrapper[4929]: I1002 12:29:42.887822 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerName="init" Oct 02 12:29:42 crc kubenswrapper[4929]: E1002 12:29:42.887860 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerName="dnsmasq-dns" Oct 02 12:29:42 crc kubenswrapper[4929]: I1002 12:29:42.887868 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerName="dnsmasq-dns" Oct 02 12:29:42 crc kubenswrapper[4929]: I1002 12:29:42.888307 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f61a918-28e7-4df4-b84c-19728acf4ef5" containerName="dnsmasq-dns" Oct 02 12:29:42 crc kubenswrapper[4929]: I1002 12:29:42.890209 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:42 crc kubenswrapper[4929]: I1002 12:29:42.911407 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47jwl"] Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.008315 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-utilities\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.008594 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5zg\" (UniqueName: \"kubernetes.io/projected/e2f65ef4-9098-49b3-88bc-c171519f3a60-kube-api-access-xz5zg\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.008653 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-catalog-content\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.109811 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5zg\" (UniqueName: \"kubernetes.io/projected/e2f65ef4-9098-49b3-88bc-c171519f3a60-kube-api-access-xz5zg\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.110189 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-catalog-content\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.110366 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-utilities\") pod \"redhat-marketplace-47jwl\" (UID: 
\"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.110832 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-catalog-content\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.110992 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-utilities\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.129465 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5zg\" (UniqueName: \"kubernetes.io/projected/e2f65ef4-9098-49b3-88bc-c171519f3a60-kube-api-access-xz5zg\") pod \"redhat-marketplace-47jwl\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.235767 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.642509 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47jwl"] Oct 02 12:29:43 crc kubenswrapper[4929]: W1002 12:29:43.647546 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f65ef4_9098_49b3_88bc_c171519f3a60.slice/crio-2a6b0b63f1145eff4d78c939c7b37c8a204d023b901806d636a66c7c0aab8f4f WatchSource:0}: Error finding container 2a6b0b63f1145eff4d78c939c7b37c8a204d023b901806d636a66c7c0aab8f4f: Status 404 returned error can't find the container with id 2a6b0b63f1145eff4d78c939c7b37c8a204d023b901806d636a66c7c0aab8f4f Oct 02 12:29:43 crc kubenswrapper[4929]: I1002 12:29:43.710638 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47jwl" event={"ID":"e2f65ef4-9098-49b3-88bc-c171519f3a60","Type":"ContainerStarted","Data":"2a6b0b63f1145eff4d78c939c7b37c8a204d023b901806d636a66c7c0aab8f4f"} Oct 02 12:29:44 crc kubenswrapper[4929]: I1002 12:29:44.723152 4929 generic.go:334] "Generic (PLEG): container finished" podID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerID="c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e" exitCode=0 Oct 02 12:29:44 crc kubenswrapper[4929]: I1002 12:29:44.723271 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47jwl" event={"ID":"e2f65ef4-9098-49b3-88bc-c171519f3a60","Type":"ContainerDied","Data":"c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e"} Oct 02 12:29:45 crc kubenswrapper[4929]: I1002 12:29:45.733263 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47jwl" event={"ID":"e2f65ef4-9098-49b3-88bc-c171519f3a60","Type":"ContainerStarted","Data":"a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6"} Oct 02 12:29:46 crc kubenswrapper[4929]: I1002 12:29:46.326154 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" 
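[editor's note — illustrative annotation, not part of the journal] The stretch of log above and below consists mostly of two kubelet line shapes: "SyncLoop (PLEG)" container lifecycle events (ContainerStarted/ContainerDied with a container or pod-sandbox ID in the Data field) and "SyncLoop (probe)" startup/readiness transitions. What follows is a minimal Python sketch for summarizing those two shapes from a saved copy of this excerpt; the file name kubelet.log, the regexes, and the output layout are assumptions for illustration and cover only the formats visible here.

#!/usr/bin/env python3
# Minimal sketch (assumption: the journal excerpt is saved verbatim as
# kubelet.log). It only understands the two kubelet line shapes shown here.
import re
import sys

# Shape 1: "SyncLoop (PLEG): event for pod" pod="ns/name"
#          event={"ID":"<pod-uid>","Type":"ContainerStarted","Data":"<id>"}
PLEG = re.compile(
    r'SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)"\s+'
    r'event=\{"ID":"(?P<uid>[^"]+)","Type":"(?P<type>[^"]+)","Data":"(?P<data>[^"]+)"\}'
)
# Shape 2: "SyncLoop (probe)" probe="readiness" status="ready" pod="ns/name"
PROBE = re.compile(
    r'SyncLoop \(probe\)" probe="(?P<probe>[^"]+)" status="(?P<status>[^"]*)" pod="(?P<pod>[^"]+)"'
)

def main(path):
    with open(path, encoding="utf-8") as fh:
        # The excerpt wraps long entries across physical lines, with breaks
        # falling after an existing space, so drop newlines before matching.
        text = fh.read().replace("\n", "")
    for m in PLEG.finditer(text):
        # Data carries the container (or pod sandbox) ID; 12 chars is enough
        # to cross-reference against crio-<id> cgroup names in the log.
        print(f'{m["pod"]:50} {m["type"]:16} {m["data"][:12]}')
    for m in PROBE.finditer(text):
        print(f'{m["pod"]:50} probe={m["probe"]:9} status={m["status"] or "(empty)"}')

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log")

Run against this excerpt, the first PLEG entry above would print as "openstack/openstack-galera-0  ContainerStarted  1e607270095e" (padding elided). Newlines are dropped rather than replaced with spaces because several entries here wrap mid-entry (for example a container ID split across two physical lines), and the wrap points already carry a trailing space.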
Oct 02 12:29:46 crc kubenswrapper[4929]: I1002 12:29:46.572939 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:29:46 crc kubenswrapper[4929]: I1002 12:29:46.743390 4929 generic.go:334] "Generic (PLEG): container finished" podID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerID="a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6" exitCode=0 Oct 02 12:29:46 crc kubenswrapper[4929]: I1002 12:29:46.743450 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47jwl" event={"ID":"e2f65ef4-9098-49b3-88bc-c171519f3a60","Type":"ContainerDied","Data":"a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6"} Oct 02 12:29:47 crc kubenswrapper[4929]: I1002 12:29:47.752118 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47jwl" event={"ID":"e2f65ef4-9098-49b3-88bc-c171519f3a60","Type":"ContainerStarted","Data":"9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e"} Oct 02 12:29:47 crc kubenswrapper[4929]: I1002 12:29:47.767700 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-47jwl" podStartSLOduration=3.256906247 podStartE2EDuration="5.767686062s" podCreationTimestamp="2025-10-02 12:29:42 +0000 UTC" firstStartedPulling="2025-10-02 12:29:44.727201548 +0000 UTC m=+4785.277567912" lastFinishedPulling="2025-10-02 12:29:47.237981363 +0000 UTC m=+4787.788347727" observedRunningTime="2025-10-02 12:29:47.766108546 +0000 UTC m=+4788.316474910" watchObservedRunningTime="2025-10-02 12:29:47.767686062 +0000 UTC m=+4788.318052426" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.130866 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vwmcj"] Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.134185 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.141239 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vwmcj"] Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.229124 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-config\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.229232 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctr5j\" (UniqueName: \"kubernetes.io/projected/b70b4c24-a40c-4f07-b87f-e129132e5f2e-kube-api-access-ctr5j\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.229265 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.330821 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-config\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.331218 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctr5j\" (UniqueName: \"kubernetes.io/projected/b70b4c24-a40c-4f07-b87f-e129132e5f2e-kube-api-access-ctr5j\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.331273 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.331781 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-config\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.331845 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.367788 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctr5j\" (UniqueName: 
\"kubernetes.io/projected/b70b4c24-a40c-4f07-b87f-e129132e5f2e-kube-api-access-ctr5j\") pod \"dnsmasq-dns-5b7946d7b9-vwmcj\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.453745 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.738402 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:29:51 crc kubenswrapper[4929]: I1002 12:29:51.897443 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vwmcj"] Oct 02 12:29:51 crc kubenswrapper[4929]: W1002 12:29:51.900862 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb70b4c24_a40c_4f07_b87f_e129132e5f2e.slice/crio-38b0d1c2886e10677d29ffe083e839e3447bf0be691984092689855716305ea7 WatchSource:0}: Error finding container 38b0d1c2886e10677d29ffe083e839e3447bf0be691984092689855716305ea7: Status 404 returned error can't find the container with id 38b0d1c2886e10677d29ffe083e839e3447bf0be691984092689855716305ea7 Oct 02 12:29:52 crc kubenswrapper[4929]: I1002 12:29:52.364090 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:29:52 crc kubenswrapper[4929]: I1002 12:29:52.806078 4929 generic.go:334] "Generic (PLEG): container finished" podID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerID="e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e" exitCode=0 Oct 02 12:29:52 crc kubenswrapper[4929]: I1002 12:29:52.806124 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" event={"ID":"b70b4c24-a40c-4f07-b87f-e129132e5f2e","Type":"ContainerDied","Data":"e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e"} Oct 02 12:29:52 crc kubenswrapper[4929]: I1002 12:29:52.806149 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" event={"ID":"b70b4c24-a40c-4f07-b87f-e129132e5f2e","Type":"ContainerStarted","Data":"38b0d1c2886e10677d29ffe083e839e3447bf0be691984092689855716305ea7"} Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.157877 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:29:53 crc kubenswrapper[4929]: E1002 12:29:53.158179 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.237306 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.238061 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.288227 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 
12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.749701 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerName="rabbitmq" containerID="cri-o://bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb" gracePeriod=604798 Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.824301 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" event={"ID":"b70b4c24-a40c-4f07-b87f-e129132e5f2e","Type":"ContainerStarted","Data":"266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8"} Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.824395 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.843609 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" podStartSLOduration=2.8435942499999998 podStartE2EDuration="2.84359425s" podCreationTimestamp="2025-10-02 12:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:53.840640355 +0000 UTC m=+4794.391006719" watchObservedRunningTime="2025-10-02 12:29:53.84359425 +0000 UTC m=+4794.393960614" Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.869923 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:53 crc kubenswrapper[4929]: I1002 12:29:53.911210 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47jwl"] Oct 02 12:29:54 crc kubenswrapper[4929]: I1002 12:29:54.185986 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerName="rabbitmq" containerID="cri-o://c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845" gracePeriod=604799 Oct 02 12:29:55 crc kubenswrapper[4929]: I1002 12:29:55.834521 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-47jwl" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="registry-server" containerID="cri-o://9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e" gracePeriod=2 Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.269780 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.324065 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5672: connect: connection refused" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.405264 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-utilities\") pod \"e2f65ef4-9098-49b3-88bc-c171519f3a60\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.405384 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-catalog-content\") pod \"e2f65ef4-9098-49b3-88bc-c171519f3a60\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.405502 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz5zg\" (UniqueName: \"kubernetes.io/projected/e2f65ef4-9098-49b3-88bc-c171519f3a60-kube-api-access-xz5zg\") pod \"e2f65ef4-9098-49b3-88bc-c171519f3a60\" (UID: \"e2f65ef4-9098-49b3-88bc-c171519f3a60\") " Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.406251 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-utilities" (OuterVolumeSpecName: "utilities") pod "e2f65ef4-9098-49b3-88bc-c171519f3a60" (UID: "e2f65ef4-9098-49b3-88bc-c171519f3a60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.410790 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f65ef4-9098-49b3-88bc-c171519f3a60-kube-api-access-xz5zg" (OuterVolumeSpecName: "kube-api-access-xz5zg") pod "e2f65ef4-9098-49b3-88bc-c171519f3a60" (UID: "e2f65ef4-9098-49b3-88bc-c171519f3a60"). InnerVolumeSpecName "kube-api-access-xz5zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.420609 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2f65ef4-9098-49b3-88bc-c171519f3a60" (UID: "e2f65ef4-9098-49b3-88bc-c171519f3a60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.506681 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz5zg\" (UniqueName: \"kubernetes.io/projected/e2f65ef4-9098-49b3-88bc-c171519f3a60-kube-api-access-xz5zg\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.506709 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.506721 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f65ef4-9098-49b3-88bc-c171519f3a60-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.571687 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.239:5672: connect: connection refused" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.844107 4929 generic.go:334] "Generic (PLEG): container finished" podID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerID="9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e" exitCode=0 Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.844159 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47jwl" event={"ID":"e2f65ef4-9098-49b3-88bc-c171519f3a60","Type":"ContainerDied","Data":"9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e"} Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.844188 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47jwl" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.844207 4929 scope.go:117] "RemoveContainer" containerID="9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.844188 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47jwl" event={"ID":"e2f65ef4-9098-49b3-88bc-c171519f3a60","Type":"ContainerDied","Data":"2a6b0b63f1145eff4d78c939c7b37c8a204d023b901806d636a66c7c0aab8f4f"} Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.868257 4929 scope.go:117] "RemoveContainer" containerID="a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.877736 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47jwl"] Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.885336 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-47jwl"] Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.900618 4929 scope.go:117] "RemoveContainer" containerID="c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.927345 4929 scope.go:117] "RemoveContainer" containerID="9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.928446 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wpwn7"] Oct 02 12:29:56 crc kubenswrapper[4929]: E1002 12:29:56.928756 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="registry-server" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.928773 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="registry-server" Oct 02 12:29:56 crc kubenswrapper[4929]: E1002 12:29:56.928783 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="extract-content" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.928789 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="extract-content" Oct 02 12:29:56 crc kubenswrapper[4929]: E1002 12:29:56.928821 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="extract-utilities" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.928827 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="extract-utilities" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.929021 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" containerName="registry-server" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.930057 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:56 crc kubenswrapper[4929]: E1002 12:29:56.930893 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e\": container with ID starting with 9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e not found: ID does not exist" containerID="9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.930942 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e"} err="failed to get container status \"9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e\": rpc error: code = NotFound desc = could not find container \"9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e\": container with ID starting with 9dd45314831a1e22d7ab1b920c9c3167994ebd802231c7cfcb2da92e9d9bdc4e not found: ID does not exist" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.930988 4929 scope.go:117] "RemoveContainer" containerID="a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6" Oct 02 12:29:56 crc kubenswrapper[4929]: E1002 12:29:56.931363 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6\": container with ID starting with a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6 not found: ID does not exist" containerID="a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.931388 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6"} err="failed to get container status \"a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6\": rpc error: code = NotFound desc = could not find container \"a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6\": container with ID starting with a54902b687a9ca2fabe5e1e0731eaa4fd0f37a080c669d8b4a8f97a64849d7e6 not found: ID does not exist" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.931408 4929 scope.go:117] "RemoveContainer" containerID="c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e" Oct 02 12:29:56 crc kubenswrapper[4929]: E1002 12:29:56.931602 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e\": container with ID starting with c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e not found: ID does not exist" containerID="c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.931623 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e"} err="failed to get container status \"c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e\": rpc error: code = NotFound desc = could not find container \"c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e\": container with ID starting with 
c60fecb8b8fea927f41b9b262534a5c4e2dcc6a2ceda58cca736b97cc277577e not found: ID does not exist" Oct 02 12:29:56 crc kubenswrapper[4929]: I1002 12:29:56.942573 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wpwn7"] Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.012758 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-utilities\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.012815 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpzmr\" (UniqueName: \"kubernetes.io/projected/72d13362-18e8-4d20-9b35-9c3cc5d573c2-kube-api-access-mpzmr\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.012873 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-catalog-content\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.113740 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpzmr\" (UniqueName: \"kubernetes.io/projected/72d13362-18e8-4d20-9b35-9c3cc5d573c2-kube-api-access-mpzmr\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.114070 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-catalog-content\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.114257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-utilities\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.114645 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-catalog-content\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.114794 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-utilities\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.132563 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpzmr\" (UniqueName: \"kubernetes.io/projected/72d13362-18e8-4d20-9b35-9c3cc5d573c2-kube-api-access-mpzmr\") pod \"community-operators-wpwn7\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.291189 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.625185 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wpwn7"] Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.852429 4929 generic.go:334] "Generic (PLEG): container finished" podID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerID="8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a" exitCode=0 Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.852555 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpwn7" event={"ID":"72d13362-18e8-4d20-9b35-9c3cc5d573c2","Type":"ContainerDied","Data":"8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a"} Oct 02 12:29:57 crc kubenswrapper[4929]: I1002 12:29:57.852791 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpwn7" event={"ID":"72d13362-18e8-4d20-9b35-9c3cc5d573c2","Type":"ContainerStarted","Data":"857645a0e38dd3f8bfd45fd2e7392efd33d2d76391face27bfcf74384e4fbc3b"} Oct 02 12:29:58 crc kubenswrapper[4929]: I1002 12:29:58.167605 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f65ef4-9098-49b3-88bc-c171519f3a60" path="/var/lib/kubelet/pods/e2f65ef4-9098-49b3-88bc-c171519f3a60/volumes" Oct 02 12:29:58 crc kubenswrapper[4929]: I1002 12:29:58.867408 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpwn7" event={"ID":"72d13362-18e8-4d20-9b35-9c3cc5d573c2","Type":"ContainerStarted","Data":"ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9"} Oct 02 12:29:59 crc kubenswrapper[4929]: I1002 12:29:59.882109 4929 generic.go:334] "Generic (PLEG): container finished" podID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerID="ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9" exitCode=0 Oct 02 12:29:59 crc kubenswrapper[4929]: I1002 12:29:59.882205 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpwn7" event={"ID":"72d13362-18e8-4d20-9b35-9c3cc5d573c2","Type":"ContainerDied","Data":"ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9"} Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.179109 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d"] Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.180610 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d"] Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.180720 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.183204 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.183880 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.268496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfa29925-8e88-47ad-86c6-ef9db20ad61c-secret-volume\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.269555 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4pg5\" (UniqueName: \"kubernetes.io/projected/bfa29925-8e88-47ad-86c6-ef9db20ad61c-kube-api-access-q4pg5\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.269664 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa29925-8e88-47ad-86c6-ef9db20ad61c-config-volume\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.371203 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfa29925-8e88-47ad-86c6-ef9db20ad61c-secret-volume\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.371271 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4pg5\" (UniqueName: \"kubernetes.io/projected/bfa29925-8e88-47ad-86c6-ef9db20ad61c-kube-api-access-q4pg5\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.371317 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa29925-8e88-47ad-86c6-ef9db20ad61c-config-volume\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.372361 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa29925-8e88-47ad-86c6-ef9db20ad61c-config-volume\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc 
kubenswrapper[4929]: I1002 12:30:00.380396 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfa29925-8e88-47ad-86c6-ef9db20ad61c-secret-volume\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.392302 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4pg5\" (UniqueName: \"kubernetes.io/projected/bfa29925-8e88-47ad-86c6-ef9db20ad61c-kube-api-access-q4pg5\") pod \"collect-profiles-29323470-qh64d\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.508464 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.569578 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.676639 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5359c6db-3207-412b-a6b6-2fb792dacd55-erlang-cookie-secret\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.676706 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-server-conf\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.676782 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5359c6db-3207-412b-a6b6-2fb792dacd55-pod-info\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.676839 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-erlang-cookie\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.676897 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6559\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-kube-api-access-f6559\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.677124 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.677198 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-plugins-conf\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.677227 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-plugins\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.677252 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-confd\") pod \"5359c6db-3207-412b-a6b6-2fb792dacd55\" (UID: \"5359c6db-3207-412b-a6b6-2fb792dacd55\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.678632 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.680717 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.680538 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.685808 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5359c6db-3207-412b-a6b6-2fb792dacd55-pod-info" (OuterVolumeSpecName: "pod-info") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.693795 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-kube-api-access-f6559" (OuterVolumeSpecName: "kube-api-access-f6559") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "kube-api-access-f6559". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.695283 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5359c6db-3207-412b-a6b6-2fb792dacd55-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.701570 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9" (OuterVolumeSpecName: "persistence") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.717694 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-server-conf" (OuterVolumeSpecName: "server-conf") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.770337 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.778595 4929 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5359c6db-3207-412b-a6b6-2fb792dacd55-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.778826 4929 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.778888 4929 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5359c6db-3207-412b-a6b6-2fb792dacd55-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.778948 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.779027 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6559\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-kube-api-access-f6559\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.779122 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") on node \"crc\" " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.779187 4929 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5359c6db-3207-412b-a6b6-2fb792dacd55-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.779251 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.791166 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5359c6db-3207-412b-a6b6-2fb792dacd55" (UID: "5359c6db-3207-412b-a6b6-2fb792dacd55"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.834198 4929 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.834350 4929 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9") on node "crc" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.880692 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-plugins\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.880946 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-server-conf\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881053 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-plugins-conf\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881165 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881309 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881426 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpzz9\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-kube-api-access-hpzz9\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881541 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-confd\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881622 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3659a98f-4d3d-4d11-accb-82cb65ebff64-pod-info\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881732 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-erlang-cookie\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.881852 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3659a98f-4d3d-4d11-accb-82cb65ebff64-erlang-cookie-secret\") pod \"3659a98f-4d3d-4d11-accb-82cb65ebff64\" (UID: \"3659a98f-4d3d-4d11-accb-82cb65ebff64\") " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.882282 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.883064 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.883179 4929 reconciler_common.go:293] "Volume detached for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.883266 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5359c6db-3207-412b-a6b6-2fb792dacd55-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.884740 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.885074 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3659a98f-4d3d-4d11-accb-82cb65ebff64-pod-info" (OuterVolumeSpecName: "pod-info") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.890426 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3659a98f-4d3d-4d11-accb-82cb65ebff64-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.896994 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09" (OuterVolumeSpecName: "persistence") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.897917 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-kube-api-access-hpzz9" (OuterVolumeSpecName: "kube-api-access-hpzz9") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "kube-api-access-hpzz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.899592 4929 generic.go:334] "Generic (PLEG): container finished" podID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerID="bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb" exitCode=0 Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.899720 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.899776 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5359c6db-3207-412b-a6b6-2fb792dacd55","Type":"ContainerDied","Data":"bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb"} Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.899823 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5359c6db-3207-412b-a6b6-2fb792dacd55","Type":"ContainerDied","Data":"e1fa4848f1c07cdfb5ecb805e1bd3ebbc5e14e4462b5a757ec2772911a12d922"} Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.899841 4929 scope.go:117] "RemoveContainer" containerID="bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.909085 4929 generic.go:334] "Generic (PLEG): container finished" podID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerID="c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845" exitCode=0 Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.909313 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.910344 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3659a98f-4d3d-4d11-accb-82cb65ebff64","Type":"ContainerDied","Data":"c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845"} Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.910387 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3659a98f-4d3d-4d11-accb-82cb65ebff64","Type":"ContainerDied","Data":"17b6165508856bd2c1845a646f6baaebba78185ca4ef64053bde9e6574b73275"} Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.913076 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-server-conf" (OuterVolumeSpecName: "server-conf") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.915599 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpwn7" event={"ID":"72d13362-18e8-4d20-9b35-9c3cc5d573c2","Type":"ContainerStarted","Data":"8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48"} Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.916353 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d"] Oct 02 12:30:00 crc kubenswrapper[4929]: W1002 12:30:00.920933 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa29925_8e88_47ad_86c6_ef9db20ad61c.slice/crio-0ba374f1ded1b2d2a4ed95a553034ac266c7805d515f94da9a352cf622a40899 WatchSource:0}: Error finding container 0ba374f1ded1b2d2a4ed95a553034ac266c7805d515f94da9a352cf622a40899: Status 404 returned error can't find the container with id 0ba374f1ded1b2d2a4ed95a553034ac266c7805d515f94da9a352cf622a40899 Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.940043 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wpwn7" podStartSLOduration=2.436696107 podStartE2EDuration="4.940018725s" podCreationTimestamp="2025-10-02 12:29:56 +0000 UTC" firstStartedPulling="2025-10-02 12:29:57.855241296 +0000 UTC m=+4798.405607660" lastFinishedPulling="2025-10-02 12:30:00.358563914 +0000 UTC m=+4800.908930278" observedRunningTime="2025-10-02 12:30:00.931201561 +0000 UTC m=+4801.481567925" watchObservedRunningTime="2025-10-02 12:30:00.940018725 +0000 UTC m=+4801.490385089" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.944111 4929 scope.go:117] "RemoveContainer" containerID="cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.956631 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.965923 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.979776 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:30:00 crc kubenswrapper[4929]: E1002 12:30:00.980224 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerName="rabbitmq" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.980248 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerName="rabbitmq" Oct 02 12:30:00 crc kubenswrapper[4929]: E1002 12:30:00.980270 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerName="setup-container" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.980279 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerName="setup-container" Oct 02 12:30:00 crc kubenswrapper[4929]: E1002 12:30:00.980309 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerName="setup-container" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.980319 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" 
containerName="setup-container" Oct 02 12:30:00 crc kubenswrapper[4929]: E1002 12:30:00.980334 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerName="rabbitmq" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.980341 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerName="rabbitmq" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.980520 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" containerName="rabbitmq" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.980534 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" containerName="rabbitmq" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.981498 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.983419 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.984663 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.984843 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.985063 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.985311 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ttfj8" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.989830 4929 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.989863 4929 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3659a98f-4d3d-4d11-accb-82cb65ebff64-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.989886 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") on node \"crc\" " Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.989897 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpzz9\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-kube-api-access-hpzz9\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.989910 4929 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3659a98f-4d3d-4d11-accb-82cb65ebff64-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 12:30:00.989918 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:00 crc kubenswrapper[4929]: I1002 
12:30:00.989926 4929 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3659a98f-4d3d-4d11-accb-82cb65ebff64-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.015413 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.031841 4929 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.032070 4929 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09") on node "crc" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.066392 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3659a98f-4d3d-4d11-accb-82cb65ebff64" (UID: "3659a98f-4d3d-4d11-accb-82cb65ebff64"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093692 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093744 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093764 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093802 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093829 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093855 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgf6s\" (UniqueName: 
\"kubernetes.io/projected/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-kube-api-access-jgf6s\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093890 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093905 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093932 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093983 4929 reconciler_common.go:293] "Volume detached for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.093994 4929 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3659a98f-4d3d-4d11-accb-82cb65ebff64-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.131269 4929 scope.go:117] "RemoveContainer" containerID="bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb" Oct 02 12:30:01 crc kubenswrapper[4929]: E1002 12:30:01.131673 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb\": container with ID starting with bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb not found: ID does not exist" containerID="bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.131713 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb"} err="failed to get container status \"bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb\": rpc error: code = NotFound desc = could not find container \"bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb\": container with ID starting with bf2fdd3c1ee4a0717c0de656ed2815c16c93e95f59e5e7486b80dc385a9949eb not found: ID does not exist" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.131743 4929 scope.go:117] "RemoveContainer" containerID="cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650" Oct 02 12:30:01 crc kubenswrapper[4929]: E1002 12:30:01.132119 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650\": container with ID starting with cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650 not found: ID does not exist" containerID="cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.132146 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650"} err="failed to get container status \"cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650\": rpc error: code = NotFound desc = could not find container \"cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650\": container with ID starting with cf3caf95f841cb6389621feb33ad5d435a3b4527c91826eb89eda94aa3bb8650 not found: ID does not exist" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.132163 4929 scope.go:117] "RemoveContainer" containerID="c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195604 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195656 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195692 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195730 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195752 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195772 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195807 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195823 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.195852 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgf6s\" (UniqueName: \"kubernetes.io/projected/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-kube-api-access-jgf6s\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.196754 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.198441 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.199689 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.200133 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.200755 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.200783 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.202741 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.202749 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability 
not set. Skipping MountDevice... Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.202822 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1744e4e3ac54f5e4179690cf1074bae9cd765942657cd19f77f4eea8bc18c758/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.216220 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgf6s\" (UniqueName: \"kubernetes.io/projected/f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3-kube-api-access-jgf6s\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.230777 4929 scope.go:117] "RemoveContainer" containerID="6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.237431 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-207e500e-830c-4d9d-a3e4-c0ffb9dd80d9\") pod \"rabbitmq-server-0\" (UID: \"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3\") " pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.244415 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.255255 4929 scope.go:117] "RemoveContainer" containerID="c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845" Oct 02 12:30:01 crc kubenswrapper[4929]: E1002 12:30:01.256082 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845\": container with ID starting with c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845 not found: ID does not exist" containerID="c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.256124 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845"} err="failed to get container status \"c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845\": rpc error: code = NotFound desc = could not find container \"c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845\": container with ID starting with c90be8d64c4dd647a29e7892e0755adaf58fabb456437ea49838445dfc8d6845 not found: ID does not exist" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.256270 4929 scope.go:117] "RemoveContainer" containerID="6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8" Oct 02 12:30:01 crc kubenswrapper[4929]: E1002 12:30:01.256630 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8\": container with ID starting with 6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8 not found: ID does not exist" 
containerID="6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.256659 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8"} err="failed to get container status \"6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8\": rpc error: code = NotFound desc = could not find container \"6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8\": container with ID starting with 6c6d86ada1a0beeb87a2b98f727105316c3a28f713354f5640ca64d29edb50c8 not found: ID does not exist" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.270831 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.276665 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.300486 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.301946 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.316557 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.316859 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.317188 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4fgqj" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.317434 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.333754 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.399300 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411217 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411301 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411399 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 
12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411446 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411568 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvkhp\" (UniqueName: \"kubernetes.io/projected/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-kube-api-access-nvkhp\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411643 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411769 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411894 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.411974 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.461882 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.515632 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.515938 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.515981 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.516015 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.516034 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.516055 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.516099 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvkhp\" (UniqueName: \"kubernetes.io/projected/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-kube-api-access-nvkhp\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.516129 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.516175 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.516616 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.525512 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.527866 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.528642 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.530304 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.536604 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.537780 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x2dhw"] Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.539585 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.540910 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" podUID="312f7cef-f108-4b04-a009-32e0303c16b9" containerName="dnsmasq-dns" containerID="cri-o://fb2f9850195c10ad70a0a03b738f0ece3394c3b293b696b05ab2a75f25c3c9da" gracePeriod=10 Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.562589 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvkhp\" (UniqueName: \"kubernetes.io/projected/a43f80b5-2330-4ade-9fe5-80bcb95a36b5-kube-api-access-nvkhp\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.569162 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.569213 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/29c6a0ce90f8e2fd6fc75870c9f9a54b23e28d5e81065606be22fd560126963f/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.674470 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8066dc74-9d85-4ea3-a14c-2f2b2f17eb09\") pod \"rabbitmq-cell1-server-0\" (UID: \"a43f80b5-2330-4ade-9fe5-80bcb95a36b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:01 crc kubenswrapper[4929]: I1002 12:30:01.688771 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:01.953210 4929 generic.go:334] "Generic (PLEG): container finished" podID="312f7cef-f108-4b04-a009-32e0303c16b9" containerID="fb2f9850195c10ad70a0a03b738f0ece3394c3b293b696b05ab2a75f25c3c9da" exitCode=0 Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:01.956673 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" event={"ID":"312f7cef-f108-4b04-a009-32e0303c16b9","Type":"ContainerDied","Data":"fb2f9850195c10ad70a0a03b738f0ece3394c3b293b696b05ab2a75f25c3c9da"} Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:01.962304 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:01.967833 4929 generic.go:334] "Generic (PLEG): container finished" podID="bfa29925-8e88-47ad-86c6-ef9db20ad61c" containerID="3dcf81ec885be28d70ab6d39d2ed9b6f3c5248dc831b2544084ff93bd32ed526" exitCode=0 Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:01.968881 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" event={"ID":"bfa29925-8e88-47ad-86c6-ef9db20ad61c","Type":"ContainerDied","Data":"3dcf81ec885be28d70ab6d39d2ed9b6f3c5248dc831b2544084ff93bd32ed526"} Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:01.968941 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" event={"ID":"bfa29925-8e88-47ad-86c6-ef9db20ad61c","Type":"ContainerStarted","Data":"0ba374f1ded1b2d2a4ed95a553034ac266c7805d515f94da9a352cf622a40899"} Oct 02 12:30:02 crc kubenswrapper[4929]: W1002 12:30:01.988181 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5ea066c_e1c6_43a4_9e4e_2783a95b8ba3.slice/crio-2cdc54a3851ac4bcfc1381519365b573b6b4674db4051052e2b08b8962ce3bf1 WatchSource:0}: Error finding container 2cdc54a3851ac4bcfc1381519365b573b6b4674db4051052e2b08b8962ce3bf1: Status 404 returned error can't find the container with id 2cdc54a3851ac4bcfc1381519365b573b6b4674db4051052e2b08b8962ce3bf1 Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.040497 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 
12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.146374 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.175437 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3659a98f-4d3d-4d11-accb-82cb65ebff64" path="/var/lib/kubelet/pods/3659a98f-4d3d-4d11-accb-82cb65ebff64/volumes" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.176171 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5359c6db-3207-412b-a6b6-2fb792dacd55" path="/var/lib/kubelet/pods/5359c6db-3207-412b-a6b6-2fb792dacd55/volumes" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.329507 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-dns-svc\") pod \"312f7cef-f108-4b04-a009-32e0303c16b9\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.329830 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjllq\" (UniqueName: \"kubernetes.io/projected/312f7cef-f108-4b04-a009-32e0303c16b9-kube-api-access-hjllq\") pod \"312f7cef-f108-4b04-a009-32e0303c16b9\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.329984 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-config\") pod \"312f7cef-f108-4b04-a009-32e0303c16b9\" (UID: \"312f7cef-f108-4b04-a009-32e0303c16b9\") " Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.333985 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312f7cef-f108-4b04-a009-32e0303c16b9-kube-api-access-hjllq" (OuterVolumeSpecName: "kube-api-access-hjllq") pod "312f7cef-f108-4b04-a009-32e0303c16b9" (UID: "312f7cef-f108-4b04-a009-32e0303c16b9"). InnerVolumeSpecName "kube-api-access-hjllq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.375512 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-config" (OuterVolumeSpecName: "config") pod "312f7cef-f108-4b04-a009-32e0303c16b9" (UID: "312f7cef-f108-4b04-a009-32e0303c16b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.377681 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "312f7cef-f108-4b04-a009-32e0303c16b9" (UID: "312f7cef-f108-4b04-a009-32e0303c16b9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.432129 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjllq\" (UniqueName: \"kubernetes.io/projected/312f7cef-f108-4b04-a009-32e0303c16b9-kube-api-access-hjllq\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.432161 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.432176 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312f7cef-f108-4b04-a009-32e0303c16b9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.976609 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3","Type":"ContainerStarted","Data":"2cdc54a3851ac4bcfc1381519365b573b6b4674db4051052e2b08b8962ce3bf1"} Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.982709 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" event={"ID":"312f7cef-f108-4b04-a009-32e0303c16b9","Type":"ContainerDied","Data":"c66ef47a030efd785af06744dbf1e482759a78d525411c1968eccb9f1d6eac72"} Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.982753 4929 scope.go:117] "RemoveContainer" containerID="fb2f9850195c10ad70a0a03b738f0ece3394c3b293b696b05ab2a75f25c3c9da" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.982775 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x2dhw" Oct 02 12:30:02 crc kubenswrapper[4929]: I1002 12:30:02.984393 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a43f80b5-2330-4ade-9fe5-80bcb95a36b5","Type":"ContainerStarted","Data":"b60622d762bbb7ceae98aad3e0e70bb01129bcb5875456107e849e4cb72e59c0"} Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.061647 4929 scope.go:117] "RemoveContainer" containerID="994f6ae895d895365305ce2b2517f3c8079d463c9a46d37e48fbb292997351b6" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.066812 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x2dhw"] Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.075381 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x2dhw"] Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.291139 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.445797 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfa29925-8e88-47ad-86c6-ef9db20ad61c-secret-volume\") pod \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.445853 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa29925-8e88-47ad-86c6-ef9db20ad61c-config-volume\") pod \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.445929 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4pg5\" (UniqueName: \"kubernetes.io/projected/bfa29925-8e88-47ad-86c6-ef9db20ad61c-kube-api-access-q4pg5\") pod \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\" (UID: \"bfa29925-8e88-47ad-86c6-ef9db20ad61c\") " Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.447737 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa29925-8e88-47ad-86c6-ef9db20ad61c-config-volume" (OuterVolumeSpecName: "config-volume") pod "bfa29925-8e88-47ad-86c6-ef9db20ad61c" (UID: "bfa29925-8e88-47ad-86c6-ef9db20ad61c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.451133 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa29925-8e88-47ad-86c6-ef9db20ad61c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bfa29925-8e88-47ad-86c6-ef9db20ad61c" (UID: "bfa29925-8e88-47ad-86c6-ef9db20ad61c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.451355 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa29925-8e88-47ad-86c6-ef9db20ad61c-kube-api-access-q4pg5" (OuterVolumeSpecName: "kube-api-access-q4pg5") pod "bfa29925-8e88-47ad-86c6-ef9db20ad61c" (UID: "bfa29925-8e88-47ad-86c6-ef9db20ad61c"). InnerVolumeSpecName "kube-api-access-q4pg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.547491 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfa29925-8e88-47ad-86c6-ef9db20ad61c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.547760 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa29925-8e88-47ad-86c6-ef9db20ad61c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.547775 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4pg5\" (UniqueName: \"kubernetes.io/projected/bfa29925-8e88-47ad-86c6-ef9db20ad61c-kube-api-access-q4pg5\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.995914 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a43f80b5-2330-4ade-9fe5-80bcb95a36b5","Type":"ContainerStarted","Data":"62035e884fb1f39e1a043d7a6d6f2e39892f12ab71fe7a07f858cf6268e2d7e1"} Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.998280 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.998377 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d" event={"ID":"bfa29925-8e88-47ad-86c6-ef9db20ad61c","Type":"ContainerDied","Data":"0ba374f1ded1b2d2a4ed95a553034ac266c7805d515f94da9a352cf622a40899"} Oct 02 12:30:03 crc kubenswrapper[4929]: I1002 12:30:03.998412 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba374f1ded1b2d2a4ed95a553034ac266c7805d515f94da9a352cf622a40899" Oct 02 12:30:04 crc kubenswrapper[4929]: I1002 12:30:04.000160 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3","Type":"ContainerStarted","Data":"aada3e61f1d95262bf25cb5c7d2ca54f36ae61055d9b8caec7bdd28e5b7da3b4"} Oct 02 12:30:04 crc kubenswrapper[4929]: I1002 12:30:04.156622 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:30:04 crc kubenswrapper[4929]: E1002 12:30:04.156882 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:30:04 crc kubenswrapper[4929]: I1002 12:30:04.165703 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312f7cef-f108-4b04-a009-32e0303c16b9" path="/var/lib/kubelet/pods/312f7cef-f108-4b04-a009-32e0303c16b9/volumes" Oct 02 12:30:04 crc kubenswrapper[4929]: I1002 12:30:04.350319 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq"] Oct 02 12:30:04 crc kubenswrapper[4929]: I1002 12:30:04.354947 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-mhgwq"] Oct 02 12:30:06 crc kubenswrapper[4929]: I1002 12:30:06.166445 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff74db9-c703-44d9-9bcf-bc6ef951a464" path="/var/lib/kubelet/pods/dff74db9-c703-44d9-9bcf-bc6ef951a464/volumes" Oct 02 12:30:07 crc kubenswrapper[4929]: I1002 12:30:07.292224 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:30:07 crc kubenswrapper[4929]: I1002 12:30:07.293031 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:30:07 crc kubenswrapper[4929]: I1002 12:30:07.339896 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:30:08 crc kubenswrapper[4929]: I1002 12:30:08.077910 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:30:08 crc kubenswrapper[4929]: I1002 12:30:08.122264 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wpwn7"] Oct 02 12:30:10 crc kubenswrapper[4929]: I1002 12:30:10.051792 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wpwn7" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="registry-server" containerID="cri-o://8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48" gracePeriod=2 Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.053471 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.060937 4929 generic.go:334] "Generic (PLEG): container finished" podID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerID="8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48" exitCode=0 Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.060992 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpwn7" event={"ID":"72d13362-18e8-4d20-9b35-9c3cc5d573c2","Type":"ContainerDied","Data":"8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48"} Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.061020 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpwn7" event={"ID":"72d13362-18e8-4d20-9b35-9c3cc5d573c2","Type":"ContainerDied","Data":"857645a0e38dd3f8bfd45fd2e7392efd33d2d76391face27bfcf74384e4fbc3b"} Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.061044 4929 scope.go:117] "RemoveContainer" containerID="8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.061059 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wpwn7" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.086772 4929 scope.go:117] "RemoveContainer" containerID="ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.106527 4929 scope.go:117] "RemoveContainer" containerID="8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.133367 4929 scope.go:117] "RemoveContainer" containerID="8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48" Oct 02 12:30:11 crc kubenswrapper[4929]: E1002 12:30:11.134001 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48\": container with ID starting with 8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48 not found: ID does not exist" containerID="8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.134049 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48"} err="failed to get container status \"8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48\": rpc error: code = NotFound desc = could not find container \"8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48\": container with ID starting with 8124266815ee626e78a3ad1d6b90fb084ac986348bd419422e3880a84a5ecc48 not found: ID does not exist" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.134089 4929 scope.go:117] "RemoveContainer" containerID="ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9" Oct 02 12:30:11 crc kubenswrapper[4929]: E1002 12:30:11.134482 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9\": container with ID starting with ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9 not found: ID does not exist" containerID="ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.134528 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9"} err="failed to get container status \"ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9\": rpc error: code = NotFound desc = could not find container \"ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9\": container with ID starting with ac87d9daf69725bb7959b6ec176cdd5c5bed99ba2031584a05fbbb24cff74fa9 not found: ID does not exist" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.134565 4929 scope.go:117] "RemoveContainer" containerID="8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a" Oct 02 12:30:11 crc kubenswrapper[4929]: E1002 12:30:11.134942 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a\": container with ID starting with 8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a not found: ID does not exist" containerID="8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a" 
Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.134996 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a"} err="failed to get container status \"8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a\": rpc error: code = NotFound desc = could not find container \"8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a\": container with ID starting with 8fabc727c17e2f0024a01e207824019a28333d6fb7f89389627717ae9076c81a not found: ID does not exist" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.162441 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-catalog-content\") pod \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.162749 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpzmr\" (UniqueName: \"kubernetes.io/projected/72d13362-18e8-4d20-9b35-9c3cc5d573c2-kube-api-access-mpzmr\") pod \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.162833 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-utilities\") pod \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\" (UID: \"72d13362-18e8-4d20-9b35-9c3cc5d573c2\") " Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.164316 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-utilities" (OuterVolumeSpecName: "utilities") pod "72d13362-18e8-4d20-9b35-9c3cc5d573c2" (UID: "72d13362-18e8-4d20-9b35-9c3cc5d573c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.168939 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d13362-18e8-4d20-9b35-9c3cc5d573c2-kube-api-access-mpzmr" (OuterVolumeSpecName: "kube-api-access-mpzmr") pod "72d13362-18e8-4d20-9b35-9c3cc5d573c2" (UID: "72d13362-18e8-4d20-9b35-9c3cc5d573c2"). InnerVolumeSpecName "kube-api-access-mpzmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.258604 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72d13362-18e8-4d20-9b35-9c3cc5d573c2" (UID: "72d13362-18e8-4d20-9b35-9c3cc5d573c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.266933 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpzmr\" (UniqueName: \"kubernetes.io/projected/72d13362-18e8-4d20-9b35-9c3cc5d573c2-kube-api-access-mpzmr\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.267148 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.267158 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72d13362-18e8-4d20-9b35-9c3cc5d573c2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.402931 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wpwn7"] Oct 02 12:30:11 crc kubenswrapper[4929]: I1002 12:30:11.413047 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wpwn7"] Oct 02 12:30:12 crc kubenswrapper[4929]: I1002 12:30:12.167146 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" path="/var/lib/kubelet/pods/72d13362-18e8-4d20-9b35-9c3cc5d573c2/volumes" Oct 02 12:30:15 crc kubenswrapper[4929]: I1002 12:30:15.350869 4929 scope.go:117] "RemoveContainer" containerID="2596cc18370ca8142983eb8609f5ad7eb686de837856952e3d5893619d47cb34" Oct 02 12:30:19 crc kubenswrapper[4929]: I1002 12:30:19.156911 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:30:19 crc kubenswrapper[4929]: E1002 12:30:19.157574 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:30:34 crc kubenswrapper[4929]: I1002 12:30:34.156999 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:30:34 crc kubenswrapper[4929]: E1002 12:30:34.157743 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:30:35 crc kubenswrapper[4929]: I1002 12:30:35.237520 4929 generic.go:334] "Generic (PLEG): container finished" podID="f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3" containerID="aada3e61f1d95262bf25cb5c7d2ca54f36ae61055d9b8caec7bdd28e5b7da3b4" exitCode=0 Oct 02 12:30:35 crc kubenswrapper[4929]: I1002 12:30:35.237636 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3","Type":"ContainerDied","Data":"aada3e61f1d95262bf25cb5c7d2ca54f36ae61055d9b8caec7bdd28e5b7da3b4"} Oct 02 
12:30:35 crc kubenswrapper[4929]: I1002 12:30:35.241684 4929 generic.go:334] "Generic (PLEG): container finished" podID="a43f80b5-2330-4ade-9fe5-80bcb95a36b5" containerID="62035e884fb1f39e1a043d7a6d6f2e39892f12ab71fe7a07f858cf6268e2d7e1" exitCode=0 Oct 02 12:30:35 crc kubenswrapper[4929]: I1002 12:30:35.241732 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a43f80b5-2330-4ade-9fe5-80bcb95a36b5","Type":"ContainerDied","Data":"62035e884fb1f39e1a043d7a6d6f2e39892f12ab71fe7a07f858cf6268e2d7e1"} Oct 02 12:30:36 crc kubenswrapper[4929]: I1002 12:30:36.251103 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3","Type":"ContainerStarted","Data":"cbaeea08392b2926918850cea5174a9bece8ced33ed9e1a474233556596a58ff"} Oct 02 12:30:36 crc kubenswrapper[4929]: I1002 12:30:36.251564 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 12:30:36 crc kubenswrapper[4929]: I1002 12:30:36.253140 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a43f80b5-2330-4ade-9fe5-80bcb95a36b5","Type":"ContainerStarted","Data":"fbfc6695ff84e85843dfb73a2e079df81bd8d0bb3328d0133e1dad70f3910332"} Oct 02 12:30:36 crc kubenswrapper[4929]: I1002 12:30:36.253351 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:36 crc kubenswrapper[4929]: I1002 12:30:36.275783 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.275764273 podStartE2EDuration="36.275764273s" podCreationTimestamp="2025-10-02 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:36.27080407 +0000 UTC m=+4836.821170444" watchObservedRunningTime="2025-10-02 12:30:36.275764273 +0000 UTC m=+4836.826130637" Oct 02 12:30:46 crc kubenswrapper[4929]: I1002 12:30:46.157133 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:30:47 crc kubenswrapper[4929]: I1002 12:30:47.327663 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"044cd82883d551890fa7db1c4b98cedddb42bd10a62b8c9f6662e1a7e915441a"} Oct 02 12:30:47 crc kubenswrapper[4929]: I1002 12:30:47.352202 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.352184412 podStartE2EDuration="46.352184412s" podCreationTimestamp="2025-10-02 12:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:36.304286885 +0000 UTC m=+4836.854653249" watchObservedRunningTime="2025-10-02 12:30:47.352184412 +0000 UTC m=+4847.902550776" Oct 02 12:30:51 crc kubenswrapper[4929]: I1002 12:30:51.247234 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 12:30:51 crc kubenswrapper[4929]: I1002 12:30:51.692216 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 
12:30:58.151461 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:30:58 crc kubenswrapper[4929]: E1002 12:30:58.152312 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312f7cef-f108-4b04-a009-32e0303c16b9" containerName="init" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152325 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="312f7cef-f108-4b04-a009-32e0303c16b9" containerName="init" Oct 02 12:30:58 crc kubenswrapper[4929]: E1002 12:30:58.152346 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312f7cef-f108-4b04-a009-32e0303c16b9" containerName="dnsmasq-dns" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152354 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="312f7cef-f108-4b04-a009-32e0303c16b9" containerName="dnsmasq-dns" Oct 02 12:30:58 crc kubenswrapper[4929]: E1002 12:30:58.152374 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="extract-utilities" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152380 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="extract-utilities" Oct 02 12:30:58 crc kubenswrapper[4929]: E1002 12:30:58.152389 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa29925-8e88-47ad-86c6-ef9db20ad61c" containerName="collect-profiles" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152396 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa29925-8e88-47ad-86c6-ef9db20ad61c" containerName="collect-profiles" Oct 02 12:30:58 crc kubenswrapper[4929]: E1002 12:30:58.152406 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="registry-server" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152412 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="registry-server" Oct 02 12:30:58 crc kubenswrapper[4929]: E1002 12:30:58.152425 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="extract-content" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152431 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="extract-content" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152572 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa29925-8e88-47ad-86c6-ef9db20ad61c" containerName="collect-profiles" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152592 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d13362-18e8-4d20-9b35-9c3cc5d573c2" containerName="registry-server" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.152604 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="312f7cef-f108-4b04-a009-32e0303c16b9" containerName="dnsmasq-dns" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.153188 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.155502 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-x9n8v" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.167913 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.351912 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9zfj\" (UniqueName: \"kubernetes.io/projected/95e3b289-5251-4962-9557-1eb602aad43b-kube-api-access-p9zfj\") pod \"mariadb-client-1-default\" (UID: \"95e3b289-5251-4962-9557-1eb602aad43b\") " pod="openstack/mariadb-client-1-default" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.453694 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9zfj\" (UniqueName: \"kubernetes.io/projected/95e3b289-5251-4962-9557-1eb602aad43b-kube-api-access-p9zfj\") pod \"mariadb-client-1-default\" (UID: \"95e3b289-5251-4962-9557-1eb602aad43b\") " pod="openstack/mariadb-client-1-default" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.472277 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9zfj\" (UniqueName: \"kubernetes.io/projected/95e3b289-5251-4962-9557-1eb602aad43b-kube-api-access-p9zfj\") pod \"mariadb-client-1-default\" (UID: \"95e3b289-5251-4962-9557-1eb602aad43b\") " pod="openstack/mariadb-client-1-default" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.472867 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:30:58 crc kubenswrapper[4929]: I1002 12:30:58.980499 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:30:58 crc kubenswrapper[4929]: W1002 12:30:58.988550 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95e3b289_5251_4962_9557_1eb602aad43b.slice/crio-a3cb8abb5cc3dd7a5d64b347f1edf4c27a75d1910811da468e7d4c651af6a140 WatchSource:0}: Error finding container a3cb8abb5cc3dd7a5d64b347f1edf4c27a75d1910811da468e7d4c651af6a140: Status 404 returned error can't find the container with id a3cb8abb5cc3dd7a5d64b347f1edf4c27a75d1910811da468e7d4c651af6a140 Oct 02 12:30:59 crc kubenswrapper[4929]: I1002 12:30:59.423080 4929 generic.go:334] "Generic (PLEG): container finished" podID="95e3b289-5251-4962-9557-1eb602aad43b" containerID="eaee440912839b3b26d154dec2e2d1171744c2e24e658fd15370533bb6976416" exitCode=0 Oct 02 12:30:59 crc kubenswrapper[4929]: I1002 12:30:59.423122 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"95e3b289-5251-4962-9557-1eb602aad43b","Type":"ContainerDied","Data":"eaee440912839b3b26d154dec2e2d1171744c2e24e658fd15370533bb6976416"} Oct 02 12:30:59 crc kubenswrapper[4929]: I1002 12:30:59.423149 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"95e3b289-5251-4962-9557-1eb602aad43b","Type":"ContainerStarted","Data":"a3cb8abb5cc3dd7a5d64b347f1edf4c27a75d1910811da468e7d4c651af6a140"} Oct 02 12:31:00 crc kubenswrapper[4929]: I1002 12:31:00.782831 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:31:00 crc kubenswrapper[4929]: I1002 12:31:00.810592 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_95e3b289-5251-4962-9557-1eb602aad43b/mariadb-client-1-default/0.log" Oct 02 12:31:00 crc kubenswrapper[4929]: I1002 12:31:00.833676 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:31:00 crc kubenswrapper[4929]: I1002 12:31:00.858048 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:31:00 crc kubenswrapper[4929]: I1002 12:31:00.887389 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9zfj\" (UniqueName: \"kubernetes.io/projected/95e3b289-5251-4962-9557-1eb602aad43b-kube-api-access-p9zfj\") pod \"95e3b289-5251-4962-9557-1eb602aad43b\" (UID: \"95e3b289-5251-4962-9557-1eb602aad43b\") " Oct 02 12:31:00 crc kubenswrapper[4929]: I1002 12:31:00.894121 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e3b289-5251-4962-9557-1eb602aad43b-kube-api-access-p9zfj" (OuterVolumeSpecName: "kube-api-access-p9zfj") pod "95e3b289-5251-4962-9557-1eb602aad43b" (UID: "95e3b289-5251-4962-9557-1eb602aad43b"). InnerVolumeSpecName "kube-api-access-p9zfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:00 crc kubenswrapper[4929]: I1002 12:31:00.989710 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9zfj\" (UniqueName: \"kubernetes.io/projected/95e3b289-5251-4962-9557-1eb602aad43b-kube-api-access-p9zfj\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.242266 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:31:01 crc kubenswrapper[4929]: E1002 12:31:01.242859 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e3b289-5251-4962-9557-1eb602aad43b" containerName="mariadb-client-1-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.242875 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e3b289-5251-4962-9557-1eb602aad43b" containerName="mariadb-client-1-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.243103 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e3b289-5251-4962-9557-1eb602aad43b" containerName="mariadb-client-1-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.243609 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.249154 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.397911 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8pml\" (UniqueName: \"kubernetes.io/projected/bd6ad37d-17fb-45b3-af3c-47c150a8034c-kube-api-access-j8pml\") pod \"mariadb-client-2-default\" (UID: \"bd6ad37d-17fb-45b3-af3c-47c150a8034c\") " pod="openstack/mariadb-client-2-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.439039 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3cb8abb5cc3dd7a5d64b347f1edf4c27a75d1910811da468e7d4c651af6a140" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.439098 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.499493 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8pml\" (UniqueName: \"kubernetes.io/projected/bd6ad37d-17fb-45b3-af3c-47c150a8034c-kube-api-access-j8pml\") pod \"mariadb-client-2-default\" (UID: \"bd6ad37d-17fb-45b3-af3c-47c150a8034c\") " pod="openstack/mariadb-client-2-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.519475 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8pml\" (UniqueName: \"kubernetes.io/projected/bd6ad37d-17fb-45b3-af3c-47c150a8034c-kube-api-access-j8pml\") pod \"mariadb-client-2-default\" (UID: \"bd6ad37d-17fb-45b3-af3c-47c150a8034c\") " pod="openstack/mariadb-client-2-default" Oct 02 12:31:01 crc kubenswrapper[4929]: I1002 12:31:01.562951 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:31:02 crc kubenswrapper[4929]: I1002 12:31:02.082521 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:31:02 crc kubenswrapper[4929]: W1002 12:31:02.085282 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd6ad37d_17fb_45b3_af3c_47c150a8034c.slice/crio-eaf1bd45f6f0d3a6ca6e16f0613f1e927346c552822aa7252bcfb3e81810fd0e WatchSource:0}: Error finding container eaf1bd45f6f0d3a6ca6e16f0613f1e927346c552822aa7252bcfb3e81810fd0e: Status 404 returned error can't find the container with id eaf1bd45f6f0d3a6ca6e16f0613f1e927346c552822aa7252bcfb3e81810fd0e Oct 02 12:31:02 crc kubenswrapper[4929]: I1002 12:31:02.171425 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e3b289-5251-4962-9557-1eb602aad43b" path="/var/lib/kubelet/pods/95e3b289-5251-4962-9557-1eb602aad43b/volumes" Oct 02 12:31:02 crc kubenswrapper[4929]: I1002 12:31:02.447179 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"bd6ad37d-17fb-45b3-af3c-47c150a8034c","Type":"ContainerStarted","Data":"e8fe31d0f1d66080d6e74715db9503181b2f6d36ac5a75fa94cbc8a84101b642"} Oct 02 12:31:02 crc kubenswrapper[4929]: I1002 12:31:02.447228 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"bd6ad37d-17fb-45b3-af3c-47c150a8034c","Type":"ContainerStarted","Data":"eaf1bd45f6f0d3a6ca6e16f0613f1e927346c552822aa7252bcfb3e81810fd0e"} Oct 02 12:31:03 crc kubenswrapper[4929]: I1002 12:31:03.455367 4929 generic.go:334] "Generic (PLEG): container finished" podID="bd6ad37d-17fb-45b3-af3c-47c150a8034c" containerID="e8fe31d0f1d66080d6e74715db9503181b2f6d36ac5a75fa94cbc8a84101b642" exitCode=0 Oct 02 12:31:03 crc kubenswrapper[4929]: I1002 12:31:03.455421 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"bd6ad37d-17fb-45b3-af3c-47c150a8034c","Type":"ContainerDied","Data":"e8fe31d0f1d66080d6e74715db9503181b2f6d36ac5a75fa94cbc8a84101b642"} Oct 02 12:31:04 crc kubenswrapper[4929]: I1002 12:31:04.811006 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:31:04 crc kubenswrapper[4929]: I1002 12:31:04.850410 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:31:04 crc kubenswrapper[4929]: I1002 12:31:04.857271 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:31:04 crc kubenswrapper[4929]: I1002 12:31:04.951763 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8pml\" (UniqueName: \"kubernetes.io/projected/bd6ad37d-17fb-45b3-af3c-47c150a8034c-kube-api-access-j8pml\") pod \"bd6ad37d-17fb-45b3-af3c-47c150a8034c\" (UID: \"bd6ad37d-17fb-45b3-af3c-47c150a8034c\") " Oct 02 12:31:04 crc kubenswrapper[4929]: I1002 12:31:04.959324 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6ad37d-17fb-45b3-af3c-47c150a8034c-kube-api-access-j8pml" (OuterVolumeSpecName: "kube-api-access-j8pml") pod "bd6ad37d-17fb-45b3-af3c-47c150a8034c" (UID: "bd6ad37d-17fb-45b3-af3c-47c150a8034c"). InnerVolumeSpecName "kube-api-access-j8pml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.054067 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8pml\" (UniqueName: \"kubernetes.io/projected/bd6ad37d-17fb-45b3-af3c-47c150a8034c-kube-api-access-j8pml\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.259682 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:31:05 crc kubenswrapper[4929]: E1002 12:31:05.260218 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6ad37d-17fb-45b3-af3c-47c150a8034c" containerName="mariadb-client-2-default" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.260245 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6ad37d-17fb-45b3-af3c-47c150a8034c" containerName="mariadb-client-2-default" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.260667 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6ad37d-17fb-45b3-af3c-47c150a8034c" containerName="mariadb-client-2-default" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.261623 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.276897 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.365325 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bxgg\" (UniqueName: \"kubernetes.io/projected/11516f98-f6ff-4c59-9965-2efaf8872571-kube-api-access-8bxgg\") pod \"mariadb-client-1\" (UID: \"11516f98-f6ff-4c59-9965-2efaf8872571\") " pod="openstack/mariadb-client-1" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.466592 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bxgg\" (UniqueName: \"kubernetes.io/projected/11516f98-f6ff-4c59-9965-2efaf8872571-kube-api-access-8bxgg\") pod \"mariadb-client-1\" (UID: \"11516f98-f6ff-4c59-9965-2efaf8872571\") " pod="openstack/mariadb-client-1" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.483853 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf1bd45f6f0d3a6ca6e16f0613f1e927346c552822aa7252bcfb3e81810fd0e" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.483941 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.491596 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bxgg\" (UniqueName: \"kubernetes.io/projected/11516f98-f6ff-4c59-9965-2efaf8872571-kube-api-access-8bxgg\") pod \"mariadb-client-1\" (UID: \"11516f98-f6ff-4c59-9965-2efaf8872571\") " pod="openstack/mariadb-client-1" Oct 02 12:31:05 crc kubenswrapper[4929]: I1002 12:31:05.582659 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:31:06 crc kubenswrapper[4929]: I1002 12:31:06.063782 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:31:06 crc kubenswrapper[4929]: I1002 12:31:06.166754 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6ad37d-17fb-45b3-af3c-47c150a8034c" path="/var/lib/kubelet/pods/bd6ad37d-17fb-45b3-af3c-47c150a8034c/volumes" Oct 02 12:31:06 crc kubenswrapper[4929]: E1002 12:31:06.347648 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11516f98_f6ff_4c59_9965_2efaf8872571.slice/crio-641c3b8c1f03b72cc6ee0b80b283c557e9887b7c738d92c1b0fd5515acc1eb6e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11516f98_f6ff_4c59_9965_2efaf8872571.slice/crio-conmon-641c3b8c1f03b72cc6ee0b80b283c557e9887b7c738d92c1b0fd5515acc1eb6e.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:31:06 crc kubenswrapper[4929]: I1002 12:31:06.492534 4929 generic.go:334] "Generic (PLEG): container finished" podID="11516f98-f6ff-4c59-9965-2efaf8872571" containerID="641c3b8c1f03b72cc6ee0b80b283c557e9887b7c738d92c1b0fd5515acc1eb6e" exitCode=0 Oct 02 12:31:06 crc kubenswrapper[4929]: I1002 12:31:06.492583 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"11516f98-f6ff-4c59-9965-2efaf8872571","Type":"ContainerDied","Data":"641c3b8c1f03b72cc6ee0b80b283c557e9887b7c738d92c1b0fd5515acc1eb6e"} Oct 02 12:31:06 crc kubenswrapper[4929]: I1002 12:31:06.492615 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"11516f98-f6ff-4c59-9965-2efaf8872571","Type":"ContainerStarted","Data":"0d1ee985f57bd9be318f35d6f84f611c11bac62682747377cba23479916d0105"} Oct 02 12:31:07 crc kubenswrapper[4929]: I1002 12:31:07.823036 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:31:07 crc kubenswrapper[4929]: I1002 12:31:07.838635 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_11516f98-f6ff-4c59-9965-2efaf8872571/mariadb-client-1/0.log" Oct 02 12:31:07 crc kubenswrapper[4929]: I1002 12:31:07.861461 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:31:07 crc kubenswrapper[4929]: I1002 12:31:07.871725 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.002259 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bxgg\" (UniqueName: \"kubernetes.io/projected/11516f98-f6ff-4c59-9965-2efaf8872571-kube-api-access-8bxgg\") pod \"11516f98-f6ff-4c59-9965-2efaf8872571\" (UID: \"11516f98-f6ff-4c59-9965-2efaf8872571\") " Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.008513 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11516f98-f6ff-4c59-9965-2efaf8872571-kube-api-access-8bxgg" (OuterVolumeSpecName: "kube-api-access-8bxgg") pod "11516f98-f6ff-4c59-9965-2efaf8872571" (UID: "11516f98-f6ff-4c59-9965-2efaf8872571"). InnerVolumeSpecName "kube-api-access-8bxgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.103741 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bxgg\" (UniqueName: \"kubernetes.io/projected/11516f98-f6ff-4c59-9965-2efaf8872571-kube-api-access-8bxgg\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.168618 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11516f98-f6ff-4c59-9965-2efaf8872571" path="/var/lib/kubelet/pods/11516f98-f6ff-4c59-9965-2efaf8872571/volumes" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.237514 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:31:08 crc kubenswrapper[4929]: E1002 12:31:08.237926 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11516f98-f6ff-4c59-9965-2efaf8872571" containerName="mariadb-client-1" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.237947 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="11516f98-f6ff-4c59-9965-2efaf8872571" containerName="mariadb-client-1" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.238178 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="11516f98-f6ff-4c59-9965-2efaf8872571" containerName="mariadb-client-1" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.239221 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.246617 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.408765 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfqj\" (UniqueName: \"kubernetes.io/projected/d01f175e-010c-4294-8ac9-a2b08f641adf-kube-api-access-9dfqj\") pod \"mariadb-client-4-default\" (UID: \"d01f175e-010c-4294-8ac9-a2b08f641adf\") " pod="openstack/mariadb-client-4-default" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.508926 4929 scope.go:117] "RemoveContainer" containerID="641c3b8c1f03b72cc6ee0b80b283c557e9887b7c738d92c1b0fd5515acc1eb6e" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.509103 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.511031 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfqj\" (UniqueName: \"kubernetes.io/projected/d01f175e-010c-4294-8ac9-a2b08f641adf-kube-api-access-9dfqj\") pod \"mariadb-client-4-default\" (UID: \"d01f175e-010c-4294-8ac9-a2b08f641adf\") " pod="openstack/mariadb-client-4-default" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.528765 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfqj\" (UniqueName: \"kubernetes.io/projected/d01f175e-010c-4294-8ac9-a2b08f641adf-kube-api-access-9dfqj\") pod \"mariadb-client-4-default\" (UID: \"d01f175e-010c-4294-8ac9-a2b08f641adf\") " pod="openstack/mariadb-client-4-default" Oct 02 12:31:08 crc kubenswrapper[4929]: I1002 12:31:08.557978 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:31:09 crc kubenswrapper[4929]: I1002 12:31:09.083134 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:31:09 crc kubenswrapper[4929]: I1002 12:31:09.520479 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d01f175e-010c-4294-8ac9-a2b08f641adf","Type":"ContainerStarted","Data":"cb61dba8925f00f807d3cb38baab2a58d9aae633fbcef00ee10d4d6238a87d6d"} Oct 02 12:31:09 crc kubenswrapper[4929]: I1002 12:31:09.520528 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d01f175e-010c-4294-8ac9-a2b08f641adf","Type":"ContainerStarted","Data":"10af2596e820ca6853d91f25f1d466fa470aa3c201a864808384cd7ed0133858"} Oct 02 12:31:09 crc kubenswrapper[4929]: I1002 12:31:09.539646 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-4-default" podStartSLOduration=1.53962757 podStartE2EDuration="1.53962757s" podCreationTimestamp="2025-10-02 12:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:31:09.534032849 +0000 UTC m=+4870.084399213" watchObservedRunningTime="2025-10-02 12:31:09.53962757 +0000 UTC m=+4870.089993934" Oct 02 12:31:09 crc kubenswrapper[4929]: I1002 12:31:09.574408 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_d01f175e-010c-4294-8ac9-a2b08f641adf/mariadb-client-4-default/0.log" Oct 02 12:31:10 crc kubenswrapper[4929]: I1002 12:31:10.529335 4929 generic.go:334] "Generic (PLEG): container finished" podID="d01f175e-010c-4294-8ac9-a2b08f641adf" containerID="cb61dba8925f00f807d3cb38baab2a58d9aae633fbcef00ee10d4d6238a87d6d" exitCode=0 Oct 02 12:31:10 crc kubenswrapper[4929]: I1002 12:31:10.529430 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"d01f175e-010c-4294-8ac9-a2b08f641adf","Type":"ContainerDied","Data":"cb61dba8925f00f807d3cb38baab2a58d9aae633fbcef00ee10d4d6238a87d6d"} Oct 02 12:31:11 crc kubenswrapper[4929]: I1002 12:31:11.867780 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:31:11 crc kubenswrapper[4929]: I1002 12:31:11.904799 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:31:11 crc kubenswrapper[4929]: I1002 12:31:11.913355 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:31:11 crc kubenswrapper[4929]: I1002 12:31:11.959855 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfqj\" (UniqueName: \"kubernetes.io/projected/d01f175e-010c-4294-8ac9-a2b08f641adf-kube-api-access-9dfqj\") pod \"d01f175e-010c-4294-8ac9-a2b08f641adf\" (UID: \"d01f175e-010c-4294-8ac9-a2b08f641adf\") " Oct 02 12:31:11 crc kubenswrapper[4929]: I1002 12:31:11.965279 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d01f175e-010c-4294-8ac9-a2b08f641adf-kube-api-access-9dfqj" (OuterVolumeSpecName: "kube-api-access-9dfqj") pod "d01f175e-010c-4294-8ac9-a2b08f641adf" (UID: "d01f175e-010c-4294-8ac9-a2b08f641adf"). InnerVolumeSpecName "kube-api-access-9dfqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:12 crc kubenswrapper[4929]: I1002 12:31:12.062352 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfqj\" (UniqueName: \"kubernetes.io/projected/d01f175e-010c-4294-8ac9-a2b08f641adf-kube-api-access-9dfqj\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:12 crc kubenswrapper[4929]: I1002 12:31:12.165853 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d01f175e-010c-4294-8ac9-a2b08f641adf" path="/var/lib/kubelet/pods/d01f175e-010c-4294-8ac9-a2b08f641adf/volumes" Oct 02 12:31:12 crc kubenswrapper[4929]: I1002 12:31:12.552380 4929 scope.go:117] "RemoveContainer" containerID="cb61dba8925f00f807d3cb38baab2a58d9aae633fbcef00ee10d4d6238a87d6d" Oct 02 12:31:12 crc kubenswrapper[4929]: I1002 12:31:12.552410 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.183260 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:31:16 crc kubenswrapper[4929]: E1002 12:31:16.184288 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d01f175e-010c-4294-8ac9-a2b08f641adf" containerName="mariadb-client-4-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.184306 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d01f175e-010c-4294-8ac9-a2b08f641adf" containerName="mariadb-client-4-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.184512 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d01f175e-010c-4294-8ac9-a2b08f641adf" containerName="mariadb-client-4-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.185065 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.239583 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-x9n8v" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.245049 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.342425 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvvh\" (UniqueName: \"kubernetes.io/projected/05e8c2bc-1cd3-4873-a394-39c964569720-kube-api-access-gxvvh\") pod \"mariadb-client-5-default\" (UID: \"05e8c2bc-1cd3-4873-a394-39c964569720\") " pod="openstack/mariadb-client-5-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.444175 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvvh\" (UniqueName: \"kubernetes.io/projected/05e8c2bc-1cd3-4873-a394-39c964569720-kube-api-access-gxvvh\") pod \"mariadb-client-5-default\" (UID: \"05e8c2bc-1cd3-4873-a394-39c964569720\") " pod="openstack/mariadb-client-5-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.463050 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvvh\" (UniqueName: \"kubernetes.io/projected/05e8c2bc-1cd3-4873-a394-39c964569720-kube-api-access-gxvvh\") pod \"mariadb-client-5-default\" (UID: \"05e8c2bc-1cd3-4873-a394-39c964569720\") " pod="openstack/mariadb-client-5-default" Oct 02 12:31:16 crc kubenswrapper[4929]: I1002 12:31:16.555468 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:31:17 crc kubenswrapper[4929]: I1002 12:31:17.035522 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:31:17 crc kubenswrapper[4929]: I1002 12:31:17.606913 4929 generic.go:334] "Generic (PLEG): container finished" podID="05e8c2bc-1cd3-4873-a394-39c964569720" containerID="e8dfc5fbb9dfad623299b357a39c87237d6bc1d449764ea0f022cdda7c40fa6d" exitCode=0 Oct 02 12:31:17 crc kubenswrapper[4929]: I1002 12:31:17.607007 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"05e8c2bc-1cd3-4873-a394-39c964569720","Type":"ContainerDied","Data":"e8dfc5fbb9dfad623299b357a39c87237d6bc1d449764ea0f022cdda7c40fa6d"} Oct 02 12:31:17 crc kubenswrapper[4929]: I1002 12:31:17.607231 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"05e8c2bc-1cd3-4873-a394-39c964569720","Type":"ContainerStarted","Data":"7af0c639bac0d6a4216282592cc45af6593e57820e97eba66ecdaff00d33343f"} Oct 02 12:31:18 crc kubenswrapper[4929]: I1002 12:31:18.935084 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:31:18 crc kubenswrapper[4929]: I1002 12:31:18.956425 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_05e8c2bc-1cd3-4873-a394-39c964569720/mariadb-client-5-default/0.log" Oct 02 12:31:18 crc kubenswrapper[4929]: I1002 12:31:18.982184 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:31:18 crc kubenswrapper[4929]: I1002 12:31:18.987781 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.080730 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxvvh\" (UniqueName: \"kubernetes.io/projected/05e8c2bc-1cd3-4873-a394-39c964569720-kube-api-access-gxvvh\") pod \"05e8c2bc-1cd3-4873-a394-39c964569720\" (UID: \"05e8c2bc-1cd3-4873-a394-39c964569720\") " Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.087089 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e8c2bc-1cd3-4873-a394-39c964569720-kube-api-access-gxvvh" (OuterVolumeSpecName: "kube-api-access-gxvvh") pod "05e8c2bc-1cd3-4873-a394-39c964569720" (UID: "05e8c2bc-1cd3-4873-a394-39c964569720"). InnerVolumeSpecName "kube-api-access-gxvvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.122175 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:31:19 crc kubenswrapper[4929]: E1002 12:31:19.122752 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e8c2bc-1cd3-4873-a394-39c964569720" containerName="mariadb-client-5-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.122775 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e8c2bc-1cd3-4873-a394-39c964569720" containerName="mariadb-client-5-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.122951 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e8c2bc-1cd3-4873-a394-39c964569720" containerName="mariadb-client-5-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.123605 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.129075 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.182212 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxvvh\" (UniqueName: \"kubernetes.io/projected/05e8c2bc-1cd3-4873-a394-39c964569720-kube-api-access-gxvvh\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.283497 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw7xm\" (UniqueName: \"kubernetes.io/projected/a3f6a144-b061-4cf3-9594-494f8d42c1c2-kube-api-access-vw7xm\") pod \"mariadb-client-6-default\" (UID: \"a3f6a144-b061-4cf3-9594-494f8d42c1c2\") " pod="openstack/mariadb-client-6-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.385411 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw7xm\" (UniqueName: \"kubernetes.io/projected/a3f6a144-b061-4cf3-9594-494f8d42c1c2-kube-api-access-vw7xm\") pod \"mariadb-client-6-default\" (UID: \"a3f6a144-b061-4cf3-9594-494f8d42c1c2\") " pod="openstack/mariadb-client-6-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.403363 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw7xm\" (UniqueName: \"kubernetes.io/projected/a3f6a144-b061-4cf3-9594-494f8d42c1c2-kube-api-access-vw7xm\") pod \"mariadb-client-6-default\" (UID: \"a3f6a144-b061-4cf3-9594-494f8d42c1c2\") " pod="openstack/mariadb-client-6-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.451679 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.628572 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7af0c639bac0d6a4216282592cc45af6593e57820e97eba66ecdaff00d33343f" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.628632 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:31:19 crc kubenswrapper[4929]: I1002 12:31:19.935369 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:31:19 crc kubenswrapper[4929]: W1002 12:31:19.943150 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3f6a144_b061_4cf3_9594_494f8d42c1c2.slice/crio-71270fcf5cdc60e04700cfd454ea42e9bad792865190292585c4afecb2c52ced WatchSource:0}: Error finding container 71270fcf5cdc60e04700cfd454ea42e9bad792865190292585c4afecb2c52ced: Status 404 returned error can't find the container with id 71270fcf5cdc60e04700cfd454ea42e9bad792865190292585c4afecb2c52ced Oct 02 12:31:20 crc kubenswrapper[4929]: I1002 12:31:20.166445 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e8c2bc-1cd3-4873-a394-39c964569720" path="/var/lib/kubelet/pods/05e8c2bc-1cd3-4873-a394-39c964569720/volumes" Oct 02 12:31:20 crc kubenswrapper[4929]: I1002 12:31:20.641001 4929 generic.go:334] "Generic (PLEG): container finished" podID="a3f6a144-b061-4cf3-9594-494f8d42c1c2" containerID="1e3dd1181a4d0c85a8215145de9dadf61e3dffef167b9bb1db012b1c735eb7d2" exitCode=0 Oct 02 12:31:20 crc kubenswrapper[4929]: I1002 12:31:20.641078 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"a3f6a144-b061-4cf3-9594-494f8d42c1c2","Type":"ContainerDied","Data":"1e3dd1181a4d0c85a8215145de9dadf61e3dffef167b9bb1db012b1c735eb7d2"} Oct 02 12:31:20 crc kubenswrapper[4929]: I1002 12:31:20.641122 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"a3f6a144-b061-4cf3-9594-494f8d42c1c2","Type":"ContainerStarted","Data":"71270fcf5cdc60e04700cfd454ea42e9bad792865190292585c4afecb2c52ced"} Oct 02 12:31:21 crc kubenswrapper[4929]: I1002 12:31:21.979675 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.020536 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_a3f6a144-b061-4cf3-9594-494f8d42c1c2/mariadb-client-6-default/0.log" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.032748 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw7xm\" (UniqueName: \"kubernetes.io/projected/a3f6a144-b061-4cf3-9594-494f8d42c1c2-kube-api-access-vw7xm\") pod \"a3f6a144-b061-4cf3-9594-494f8d42c1c2\" (UID: \"a3f6a144-b061-4cf3-9594-494f8d42c1c2\") " Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.043519 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f6a144-b061-4cf3-9594-494f8d42c1c2-kube-api-access-vw7xm" (OuterVolumeSpecName: "kube-api-access-vw7xm") pod "a3f6a144-b061-4cf3-9594-494f8d42c1c2" (UID: "a3f6a144-b061-4cf3-9594-494f8d42c1c2"). InnerVolumeSpecName "kube-api-access-vw7xm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.063902 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.069496 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.134168 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw7xm\" (UniqueName: \"kubernetes.io/projected/a3f6a144-b061-4cf3-9594-494f8d42c1c2-kube-api-access-vw7xm\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.164737 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f6a144-b061-4cf3-9594-494f8d42c1c2" path="/var/lib/kubelet/pods/a3f6a144-b061-4cf3-9594-494f8d42c1c2/volumes" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.204530 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:31:22 crc kubenswrapper[4929]: E1002 12:31:22.204937 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f6a144-b061-4cf3-9594-494f8d42c1c2" containerName="mariadb-client-6-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.204957 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f6a144-b061-4cf3-9594-494f8d42c1c2" containerName="mariadb-client-6-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.205227 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f6a144-b061-4cf3-9594-494f8d42c1c2" containerName="mariadb-client-6-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.207547 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.226064 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.235657 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqp4r\" (UniqueName: \"kubernetes.io/projected/79957a3b-290a-430d-8dfe-8c71239f7ed3-kube-api-access-vqp4r\") pod \"mariadb-client-7-default\" (UID: \"79957a3b-290a-430d-8dfe-8c71239f7ed3\") " pod="openstack/mariadb-client-7-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.336915 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqp4r\" (UniqueName: \"kubernetes.io/projected/79957a3b-290a-430d-8dfe-8c71239f7ed3-kube-api-access-vqp4r\") pod \"mariadb-client-7-default\" (UID: \"79957a3b-290a-430d-8dfe-8c71239f7ed3\") " pod="openstack/mariadb-client-7-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.364005 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqp4r\" (UniqueName: \"kubernetes.io/projected/79957a3b-290a-430d-8dfe-8c71239f7ed3-kube-api-access-vqp4r\") pod \"mariadb-client-7-default\" (UID: \"79957a3b-290a-430d-8dfe-8c71239f7ed3\") " pod="openstack/mariadb-client-7-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.533528 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.657664 4929 scope.go:117] "RemoveContainer" containerID="1e3dd1181a4d0c85a8215145de9dadf61e3dffef167b9bb1db012b1c735eb7d2" Oct 02 12:31:22 crc kubenswrapper[4929]: I1002 12:31:22.657714 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:31:23 crc kubenswrapper[4929]: I1002 12:31:23.014889 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:31:23 crc kubenswrapper[4929]: I1002 12:31:23.670561 4929 generic.go:334] "Generic (PLEG): container finished" podID="79957a3b-290a-430d-8dfe-8c71239f7ed3" containerID="cd9a06682b761c4340da1c94f96f7152c3c128ad50bffffe92f4cb022c5a74ec" exitCode=0 Oct 02 12:31:23 crc kubenswrapper[4929]: I1002 12:31:23.670623 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"79957a3b-290a-430d-8dfe-8c71239f7ed3","Type":"ContainerDied","Data":"cd9a06682b761c4340da1c94f96f7152c3c128ad50bffffe92f4cb022c5a74ec"} Oct 02 12:31:23 crc kubenswrapper[4929]: I1002 12:31:23.670840 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"79957a3b-290a-430d-8dfe-8c71239f7ed3","Type":"ContainerStarted","Data":"3c316de27a3668b2cc9ccfc6bc38a39e0763774b99fbcb4519c5fde7969f2569"} Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.002094 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.019844 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_79957a3b-290a-430d-8dfe-8c71239f7ed3/mariadb-client-7-default/0.log" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.044594 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.051426 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.078271 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqp4r\" (UniqueName: \"kubernetes.io/projected/79957a3b-290a-430d-8dfe-8c71239f7ed3-kube-api-access-vqp4r\") pod \"79957a3b-290a-430d-8dfe-8c71239f7ed3\" (UID: \"79957a3b-290a-430d-8dfe-8c71239f7ed3\") " Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.083333 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79957a3b-290a-430d-8dfe-8c71239f7ed3-kube-api-access-vqp4r" (OuterVolumeSpecName: "kube-api-access-vqp4r") pod "79957a3b-290a-430d-8dfe-8c71239f7ed3" (UID: "79957a3b-290a-430d-8dfe-8c71239f7ed3"). InnerVolumeSpecName "kube-api-access-vqp4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.179720 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:31:25 crc kubenswrapper[4929]: E1002 12:31:25.180146 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79957a3b-290a-430d-8dfe-8c71239f7ed3" containerName="mariadb-client-7-default" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.180170 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="79957a3b-290a-430d-8dfe-8c71239f7ed3" containerName="mariadb-client-7-default" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.180367 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="79957a3b-290a-430d-8dfe-8c71239f7ed3" containerName="mariadb-client-7-default" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.180364 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqp4r\" (UniqueName: \"kubernetes.io/projected/79957a3b-290a-430d-8dfe-8c71239f7ed3-kube-api-access-vqp4r\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.181199 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.196198 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.282007 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4nq8\" (UniqueName: \"kubernetes.io/projected/849567bb-735d-41a6-9220-9d2702f04c05-kube-api-access-n4nq8\") pod \"mariadb-client-2\" (UID: \"849567bb-735d-41a6-9220-9d2702f04c05\") " pod="openstack/mariadb-client-2" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.383646 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4nq8\" (UniqueName: \"kubernetes.io/projected/849567bb-735d-41a6-9220-9d2702f04c05-kube-api-access-n4nq8\") pod \"mariadb-client-2\" (UID: \"849567bb-735d-41a6-9220-9d2702f04c05\") " pod="openstack/mariadb-client-2" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.401005 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4nq8\" (UniqueName: \"kubernetes.io/projected/849567bb-735d-41a6-9220-9d2702f04c05-kube-api-access-n4nq8\") pod \"mariadb-client-2\" (UID: \"849567bb-735d-41a6-9220-9d2702f04c05\") " pod="openstack/mariadb-client-2" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.503949 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.685161 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c316de27a3668b2cc9ccfc6bc38a39e0763774b99fbcb4519c5fde7969f2569" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.685222 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:31:25 crc kubenswrapper[4929]: I1002 12:31:25.958638 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:31:26 crc kubenswrapper[4929]: I1002 12:31:26.165300 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79957a3b-290a-430d-8dfe-8c71239f7ed3" path="/var/lib/kubelet/pods/79957a3b-290a-430d-8dfe-8c71239f7ed3/volumes" Oct 02 12:31:26 crc kubenswrapper[4929]: I1002 12:31:26.693370 4929 generic.go:334] "Generic (PLEG): container finished" podID="849567bb-735d-41a6-9220-9d2702f04c05" containerID="9bcd70608068166610e1f5a8d03f985ffc2d37b988f2e83498939f29eb96f71b" exitCode=0 Oct 02 12:31:26 crc kubenswrapper[4929]: I1002 12:31:26.693416 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"849567bb-735d-41a6-9220-9d2702f04c05","Type":"ContainerDied","Data":"9bcd70608068166610e1f5a8d03f985ffc2d37b988f2e83498939f29eb96f71b"} Oct 02 12:31:26 crc kubenswrapper[4929]: I1002 12:31:26.693721 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"849567bb-735d-41a6-9220-9d2702f04c05","Type":"ContainerStarted","Data":"e40aaf5a03652640cda64ed138e9395a1ee60c2e4f90b7508dbabfb038a13ff1"} Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.120349 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.137929 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_849567bb-735d-41a6-9220-9d2702f04c05/mariadb-client-2/0.log" Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.169175 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.172255 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.259045 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4nq8\" (UniqueName: \"kubernetes.io/projected/849567bb-735d-41a6-9220-9d2702f04c05-kube-api-access-n4nq8\") pod \"849567bb-735d-41a6-9220-9d2702f04c05\" (UID: \"849567bb-735d-41a6-9220-9d2702f04c05\") " Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.265443 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849567bb-735d-41a6-9220-9d2702f04c05-kube-api-access-n4nq8" (OuterVolumeSpecName: "kube-api-access-n4nq8") pod "849567bb-735d-41a6-9220-9d2702f04c05" (UID: "849567bb-735d-41a6-9220-9d2702f04c05"). InnerVolumeSpecName "kube-api-access-n4nq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.362518 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4nq8\" (UniqueName: \"kubernetes.io/projected/849567bb-735d-41a6-9220-9d2702f04c05-kube-api-access-n4nq8\") on node \"crc\" DevicePath \"\"" Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.709731 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e40aaf5a03652640cda64ed138e9395a1ee60c2e4f90b7508dbabfb038a13ff1" Oct 02 12:31:28 crc kubenswrapper[4929]: I1002 12:31:28.709779 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:31:30 crc kubenswrapper[4929]: I1002 12:31:30.168775 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849567bb-735d-41a6-9220-9d2702f04c05" path="/var/lib/kubelet/pods/849567bb-735d-41a6-9220-9d2702f04c05/volumes" Oct 02 12:32:15 crc kubenswrapper[4929]: I1002 12:32:15.491162 4929 scope.go:117] "RemoveContainer" containerID="45f6eb7291b6e2ef890e46bacd6f28605f41eb92866064098030f2dca1a66f2b" Oct 02 12:33:14 crc kubenswrapper[4929]: I1002 12:33:14.736506 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:33:14 crc kubenswrapper[4929]: I1002 12:33:14.737566 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:33:44 crc kubenswrapper[4929]: I1002 12:33:44.737230 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:33:44 crc kubenswrapper[4929]: I1002 12:33:44.737809 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.737186 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.737762 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.737803 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.738475 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"044cd82883d551890fa7db1c4b98cedddb42bd10a62b8c9f6662e1a7e915441a"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.738535 4929 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://044cd82883d551890fa7db1c4b98cedddb42bd10a62b8c9f6662e1a7e915441a" gracePeriod=600 Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.925812 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="044cd82883d551890fa7db1c4b98cedddb42bd10a62b8c9f6662e1a7e915441a" exitCode=0 Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.925893 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"044cd82883d551890fa7db1c4b98cedddb42bd10a62b8c9f6662e1a7e915441a"} Oct 02 12:34:14 crc kubenswrapper[4929]: I1002 12:34:14.926160 4929 scope.go:117] "RemoveContainer" containerID="bd77e6d122e4c625b8b34dbd1ad744ae943deb7a3bf92bfecb742cb37b227fa8" Oct 02 12:34:15 crc kubenswrapper[4929]: I1002 12:34:15.935490 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf"} Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.601830 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bvgtd"] Oct 02 12:35:56 crc kubenswrapper[4929]: E1002 12:35:56.602646 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849567bb-735d-41a6-9220-9d2702f04c05" containerName="mariadb-client-2" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.602660 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="849567bb-735d-41a6-9220-9d2702f04c05" containerName="mariadb-client-2" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.602826 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="849567bb-735d-41a6-9220-9d2702f04c05" containerName="mariadb-client-2" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.603989 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.613618 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvgtd"] Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.685804 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-catalog-content\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.685887 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzrd9\" (UniqueName: \"kubernetes.io/projected/c0775ae0-e2eb-4d3b-845c-60edce797418-kube-api-access-rzrd9\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.685969 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-utilities\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.787035 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-utilities\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.787106 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-catalog-content\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.787159 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzrd9\" (UniqueName: \"kubernetes.io/projected/c0775ae0-e2eb-4d3b-845c-60edce797418-kube-api-access-rzrd9\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.787740 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-utilities\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.787833 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-catalog-content\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.807374 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rzrd9\" (UniqueName: \"kubernetes.io/projected/c0775ae0-e2eb-4d3b-845c-60edce797418-kube-api-access-rzrd9\") pod \"redhat-operators-bvgtd\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:56 crc kubenswrapper[4929]: I1002 12:35:56.924579 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:35:57 crc kubenswrapper[4929]: I1002 12:35:57.390822 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bvgtd"] Oct 02 12:35:57 crc kubenswrapper[4929]: I1002 12:35:57.743865 4929 generic.go:334] "Generic (PLEG): container finished" podID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerID="80296a01a2f64a2f4824b4492df7f7c771b6d6b858170559ac9a3a94f3663f34" exitCode=0 Oct 02 12:35:57 crc kubenswrapper[4929]: I1002 12:35:57.743938 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvgtd" event={"ID":"c0775ae0-e2eb-4d3b-845c-60edce797418","Type":"ContainerDied","Data":"80296a01a2f64a2f4824b4492df7f7c771b6d6b858170559ac9a3a94f3663f34"} Oct 02 12:35:57 crc kubenswrapper[4929]: I1002 12:35:57.744212 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvgtd" event={"ID":"c0775ae0-e2eb-4d3b-845c-60edce797418","Type":"ContainerStarted","Data":"4bf151b7e22331d57b82fd3366b70c82aa241aca20210dfa6a11ce292e72b349"} Oct 02 12:35:57 crc kubenswrapper[4929]: I1002 12:35:57.745437 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:35:59 crc kubenswrapper[4929]: I1002 12:35:59.759531 4929 generic.go:334] "Generic (PLEG): container finished" podID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerID="1a4d857c7f8246001db2b0a39bd63439eba4d80e04e00b4df8b51a69ae3930b1" exitCode=0 Oct 02 12:35:59 crc kubenswrapper[4929]: I1002 12:35:59.759657 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvgtd" event={"ID":"c0775ae0-e2eb-4d3b-845c-60edce797418","Type":"ContainerDied","Data":"1a4d857c7f8246001db2b0a39bd63439eba4d80e04e00b4df8b51a69ae3930b1"} Oct 02 12:36:00 crc kubenswrapper[4929]: I1002 12:36:00.796633 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvgtd" event={"ID":"c0775ae0-e2eb-4d3b-845c-60edce797418","Type":"ContainerStarted","Data":"b01b7108a6cf5eeecbd290c927af6df9e1d63a4f30ee4c86f9109b29b5122310"} Oct 02 12:36:00 crc kubenswrapper[4929]: I1002 12:36:00.816844 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bvgtd" podStartSLOduration=2.196017952 podStartE2EDuration="4.816824026s" podCreationTimestamp="2025-10-02 12:35:56 +0000 UTC" firstStartedPulling="2025-10-02 12:35:57.745198269 +0000 UTC m=+5158.295564633" lastFinishedPulling="2025-10-02 12:36:00.366004303 +0000 UTC m=+5160.916370707" observedRunningTime="2025-10-02 12:36:00.814239741 +0000 UTC m=+5161.364606105" watchObservedRunningTime="2025-10-02 12:36:00.816824026 +0000 UTC m=+5161.367190390" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.413008 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.414990 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.417412 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-x9n8v" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.423219 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.525833 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-084da11a-fef6-4192-b401-c3724c38f186\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186\") pod \"mariadb-copy-data\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.526067 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7k46\" (UniqueName: \"kubernetes.io/projected/15b5b57e-78a6-41a3-baed-a92c20bb06dd-kube-api-access-t7k46\") pod \"mariadb-copy-data\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.628111 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-084da11a-fef6-4192-b401-c3724c38f186\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186\") pod \"mariadb-copy-data\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.628196 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7k46\" (UniqueName: \"kubernetes.io/projected/15b5b57e-78a6-41a3-baed-a92c20bb06dd-kube-api-access-t7k46\") pod \"mariadb-copy-data\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.631779 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.631826 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-084da11a-fef6-4192-b401-c3724c38f186\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186\") pod \"mariadb-copy-data\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c0a7b5a96fcd1a4297bb523d79c6e7bc39cdd36f2f3cb6fc6893db40c800e7fb/globalmount\"" pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.648739 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7k46\" (UniqueName: \"kubernetes.io/projected/15b5b57e-78a6-41a3-baed-a92c20bb06dd-kube-api-access-t7k46\") pod \"mariadb-copy-data\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.663356 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-084da11a-fef6-4192-b401-c3724c38f186\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186\") pod \"mariadb-copy-data\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.736785 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.925604 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.926002 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:36:06 crc kubenswrapper[4929]: I1002 12:36:06.973909 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:36:07 crc kubenswrapper[4929]: I1002 12:36:07.203641 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 12:36:07 crc kubenswrapper[4929]: I1002 12:36:07.869194 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"15b5b57e-78a6-41a3-baed-a92c20bb06dd","Type":"ContainerStarted","Data":"80247cfb8ea20af64baec70686dd3ab20b2ad1b8f59c3081cd79d61d31387831"} Oct 02 12:36:07 crc kubenswrapper[4929]: I1002 12:36:07.869504 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"15b5b57e-78a6-41a3-baed-a92c20bb06dd","Type":"ContainerStarted","Data":"caa0dda9a9188d47a6e476d42c9e66addc22dc8020a7acf08721711163c1ff55"} Oct 02 12:36:07 crc kubenswrapper[4929]: I1002 12:36:07.884431 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.884413059 podStartE2EDuration="2.884413059s" podCreationTimestamp="2025-10-02 12:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:36:07.882081532 +0000 UTC m=+5168.432447906" watchObservedRunningTime="2025-10-02 12:36:07.884413059 +0000 UTC m=+5168.434779423" Oct 02 12:36:07 crc kubenswrapper[4929]: I1002 12:36:07.913200 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:36:08 crc kubenswrapper[4929]: I1002 12:36:08.209238 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvgtd"] Oct 02 12:36:09 crc kubenswrapper[4929]: I1002 12:36:09.856293 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:09 crc kubenswrapper[4929]: I1002 12:36:09.857511 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:09 crc kubenswrapper[4929]: I1002 12:36:09.867979 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:09 crc kubenswrapper[4929]: I1002 12:36:09.884679 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bvgtd" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="registry-server" containerID="cri-o://b01b7108a6cf5eeecbd290c927af6df9e1d63a4f30ee4c86f9109b29b5122310" gracePeriod=2 Oct 02 12:36:09 crc kubenswrapper[4929]: I1002 12:36:09.975540 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzr9t\" (UniqueName: \"kubernetes.io/projected/5d982b88-109b-431c-9a4a-b38b2af76f8f-kube-api-access-qzr9t\") pod \"mariadb-client\" (UID: \"5d982b88-109b-431c-9a4a-b38b2af76f8f\") " pod="openstack/mariadb-client" Oct 02 12:36:10 crc kubenswrapper[4929]: I1002 12:36:10.077679 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzr9t\" (UniqueName: \"kubernetes.io/projected/5d982b88-109b-431c-9a4a-b38b2af76f8f-kube-api-access-qzr9t\") pod \"mariadb-client\" (UID: \"5d982b88-109b-431c-9a4a-b38b2af76f8f\") " pod="openstack/mariadb-client" Oct 02 12:36:10 crc kubenswrapper[4929]: I1002 12:36:10.104633 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzr9t\" (UniqueName: \"kubernetes.io/projected/5d982b88-109b-431c-9a4a-b38b2af76f8f-kube-api-access-qzr9t\") pod \"mariadb-client\" (UID: \"5d982b88-109b-431c-9a4a-b38b2af76f8f\") " pod="openstack/mariadb-client" Oct 02 12:36:10 crc kubenswrapper[4929]: I1002 12:36:10.177337 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:10 crc kubenswrapper[4929]: I1002 12:36:10.588739 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:10 crc kubenswrapper[4929]: W1002 12:36:10.599167 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d982b88_109b_431c_9a4a_b38b2af76f8f.slice/crio-4611cd5a27ef6946890ed84aac30998d1620ed54c5028de5728df2aed9341606 WatchSource:0}: Error finding container 4611cd5a27ef6946890ed84aac30998d1620ed54c5028de5728df2aed9341606: Status 404 returned error can't find the container with id 4611cd5a27ef6946890ed84aac30998d1620ed54c5028de5728df2aed9341606 Oct 02 12:36:10 crc kubenswrapper[4929]: I1002 12:36:10.896439 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5d982b88-109b-431c-9a4a-b38b2af76f8f","Type":"ContainerStarted","Data":"4611cd5a27ef6946890ed84aac30998d1620ed54c5028de5728df2aed9341606"} Oct 02 12:36:11 crc kubenswrapper[4929]: I1002 12:36:11.906681 4929 generic.go:334] "Generic (PLEG): container finished" podID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerID="b01b7108a6cf5eeecbd290c927af6df9e1d63a4f30ee4c86f9109b29b5122310" exitCode=0 Oct 02 12:36:11 crc kubenswrapper[4929]: I1002 12:36:11.906736 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvgtd" event={"ID":"c0775ae0-e2eb-4d3b-845c-60edce797418","Type":"ContainerDied","Data":"b01b7108a6cf5eeecbd290c927af6df9e1d63a4f30ee4c86f9109b29b5122310"} Oct 02 12:36:11 crc kubenswrapper[4929]: I1002 12:36:11.908208 4929 generic.go:334] "Generic (PLEG): container finished" podID="5d982b88-109b-431c-9a4a-b38b2af76f8f" containerID="da01a8a5b101c3a223ff6a90f1a0b0866f68f94729910139031dfe1079fdfafe" exitCode=0 Oct 02 12:36:11 crc kubenswrapper[4929]: I1002 12:36:11.908230 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5d982b88-109b-431c-9a4a-b38b2af76f8f","Type":"ContainerDied","Data":"da01a8a5b101c3a223ff6a90f1a0b0866f68f94729910139031dfe1079fdfafe"} Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.105632 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.224016 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzrd9\" (UniqueName: \"kubernetes.io/projected/c0775ae0-e2eb-4d3b-845c-60edce797418-kube-api-access-rzrd9\") pod \"c0775ae0-e2eb-4d3b-845c-60edce797418\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.224156 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-utilities\") pod \"c0775ae0-e2eb-4d3b-845c-60edce797418\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.224201 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-catalog-content\") pod \"c0775ae0-e2eb-4d3b-845c-60edce797418\" (UID: \"c0775ae0-e2eb-4d3b-845c-60edce797418\") " Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.225379 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-utilities" (OuterVolumeSpecName: "utilities") pod "c0775ae0-e2eb-4d3b-845c-60edce797418" (UID: "c0775ae0-e2eb-4d3b-845c-60edce797418"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.233323 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0775ae0-e2eb-4d3b-845c-60edce797418-kube-api-access-rzrd9" (OuterVolumeSpecName: "kube-api-access-rzrd9") pod "c0775ae0-e2eb-4d3b-845c-60edce797418" (UID: "c0775ae0-e2eb-4d3b-845c-60edce797418"). InnerVolumeSpecName "kube-api-access-rzrd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.305389 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0775ae0-e2eb-4d3b-845c-60edce797418" (UID: "c0775ae0-e2eb-4d3b-845c-60edce797418"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.325892 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.325926 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0775ae0-e2eb-4d3b-845c-60edce797418-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.325939 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzrd9\" (UniqueName: \"kubernetes.io/projected/c0775ae0-e2eb-4d3b-845c-60edce797418-kube-api-access-rzrd9\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.918766 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bvgtd" event={"ID":"c0775ae0-e2eb-4d3b-845c-60edce797418","Type":"ContainerDied","Data":"4bf151b7e22331d57b82fd3366b70c82aa241aca20210dfa6a11ce292e72b349"} Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.918795 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bvgtd" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.918823 4929 scope.go:117] "RemoveContainer" containerID="b01b7108a6cf5eeecbd290c927af6df9e1d63a4f30ee4c86f9109b29b5122310" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.939531 4929 scope.go:117] "RemoveContainer" containerID="1a4d857c7f8246001db2b0a39bd63439eba4d80e04e00b4df8b51a69ae3930b1" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.965408 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bvgtd"] Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.972402 4929 scope.go:117] "RemoveContainer" containerID="80296a01a2f64a2f4824b4492df7f7c771b6d6b858170559ac9a3a94f3663f34" Oct 02 12:36:12 crc kubenswrapper[4929]: I1002 12:36:12.973422 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bvgtd"] Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.196858 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.218013 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_5d982b88-109b-431c-9a4a-b38b2af76f8f/mariadb-client/0.log" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.241180 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.249322 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.339072 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzr9t\" (UniqueName: \"kubernetes.io/projected/5d982b88-109b-431c-9a4a-b38b2af76f8f-kube-api-access-qzr9t\") pod \"5d982b88-109b-431c-9a4a-b38b2af76f8f\" (UID: \"5d982b88-109b-431c-9a4a-b38b2af76f8f\") " Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.346192 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d982b88-109b-431c-9a4a-b38b2af76f8f-kube-api-access-qzr9t" (OuterVolumeSpecName: "kube-api-access-qzr9t") pod "5d982b88-109b-431c-9a4a-b38b2af76f8f" (UID: "5d982b88-109b-431c-9a4a-b38b2af76f8f"). InnerVolumeSpecName "kube-api-access-qzr9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.362111 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:13 crc kubenswrapper[4929]: E1002 12:36:13.362570 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="registry-server" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.362641 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="registry-server" Oct 02 12:36:13 crc kubenswrapper[4929]: E1002 12:36:13.362671 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d982b88-109b-431c-9a4a-b38b2af76f8f" containerName="mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.362677 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d982b88-109b-431c-9a4a-b38b2af76f8f" containerName="mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: E1002 12:36:13.362690 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="extract-content" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.362699 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="extract-content" Oct 02 12:36:13 crc kubenswrapper[4929]: E1002 12:36:13.362712 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="extract-utilities" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.362718 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="extract-utilities" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.363010 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d982b88-109b-431c-9a4a-b38b2af76f8f" containerName="mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.363024 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" containerName="registry-server" Oct 02 
12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.363529 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.367650 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.440310 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzr9t\" (UniqueName: \"kubernetes.io/projected/5d982b88-109b-431c-9a4a-b38b2af76f8f-kube-api-access-qzr9t\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.541586 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8kv\" (UniqueName: \"kubernetes.io/projected/21d27f93-790c-47aa-83f9-4e3b0706c097-kube-api-access-cp8kv\") pod \"mariadb-client\" (UID: \"21d27f93-790c-47aa-83f9-4e3b0706c097\") " pod="openstack/mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.642749 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp8kv\" (UniqueName: \"kubernetes.io/projected/21d27f93-790c-47aa-83f9-4e3b0706c097-kube-api-access-cp8kv\") pod \"mariadb-client\" (UID: \"21d27f93-790c-47aa-83f9-4e3b0706c097\") " pod="openstack/mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.659112 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp8kv\" (UniqueName: \"kubernetes.io/projected/21d27f93-790c-47aa-83f9-4e3b0706c097-kube-api-access-cp8kv\") pod \"mariadb-client\" (UID: \"21d27f93-790c-47aa-83f9-4e3b0706c097\") " pod="openstack/mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.686927 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.926364 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.926572 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4611cd5a27ef6946890ed84aac30998d1620ed54c5028de5728df2aed9341606" Oct 02 12:36:13 crc kubenswrapper[4929]: I1002 12:36:13.942275 4929 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="5d982b88-109b-431c-9a4a-b38b2af76f8f" podUID="21d27f93-790c-47aa-83f9-4e3b0706c097" Oct 02 12:36:14 crc kubenswrapper[4929]: I1002 12:36:14.148251 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:14 crc kubenswrapper[4929]: W1002 12:36:14.152571 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21d27f93_790c_47aa_83f9_4e3b0706c097.slice/crio-bc7b30cd41b71159196202343fc50806009d112b22c6feff0076217a3fe97221 WatchSource:0}: Error finding container bc7b30cd41b71159196202343fc50806009d112b22c6feff0076217a3fe97221: Status 404 returned error can't find the container with id bc7b30cd41b71159196202343fc50806009d112b22c6feff0076217a3fe97221 Oct 02 12:36:14 crc kubenswrapper[4929]: I1002 12:36:14.165480 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d982b88-109b-431c-9a4a-b38b2af76f8f" path="/var/lib/kubelet/pods/5d982b88-109b-431c-9a4a-b38b2af76f8f/volumes" Oct 02 12:36:14 crc kubenswrapper[4929]: I1002 12:36:14.165952 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0775ae0-e2eb-4d3b-845c-60edce797418" path="/var/lib/kubelet/pods/c0775ae0-e2eb-4d3b-845c-60edce797418/volumes" Oct 02 12:36:14 crc kubenswrapper[4929]: I1002 12:36:14.938031 4929 generic.go:334] "Generic (PLEG): container finished" podID="21d27f93-790c-47aa-83f9-4e3b0706c097" containerID="6e4983a49eff6dd55026594ee3059f3c1b20d756e9676a2e7ca9fbb42c6c7058" exitCode=0 Oct 02 12:36:14 crc kubenswrapper[4929]: I1002 12:36:14.938200 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"21d27f93-790c-47aa-83f9-4e3b0706c097","Type":"ContainerDied","Data":"6e4983a49eff6dd55026594ee3059f3c1b20d756e9676a2e7ca9fbb42c6c7058"} Oct 02 12:36:14 crc kubenswrapper[4929]: I1002 12:36:14.938368 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"21d27f93-790c-47aa-83f9-4e3b0706c097","Type":"ContainerStarted","Data":"bc7b30cd41b71159196202343fc50806009d112b22c6feff0076217a3fe97221"} Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.302298 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.324869 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_21d27f93-790c-47aa-83f9-4e3b0706c097/mariadb-client/0.log" Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.355003 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.360833 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.488752 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp8kv\" (UniqueName: \"kubernetes.io/projected/21d27f93-790c-47aa-83f9-4e3b0706c097-kube-api-access-cp8kv\") pod \"21d27f93-790c-47aa-83f9-4e3b0706c097\" (UID: \"21d27f93-790c-47aa-83f9-4e3b0706c097\") " Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.494745 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d27f93-790c-47aa-83f9-4e3b0706c097-kube-api-access-cp8kv" (OuterVolumeSpecName: "kube-api-access-cp8kv") pod "21d27f93-790c-47aa-83f9-4e3b0706c097" (UID: "21d27f93-790c-47aa-83f9-4e3b0706c097"). InnerVolumeSpecName "kube-api-access-cp8kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.590937 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp8kv\" (UniqueName: \"kubernetes.io/projected/21d27f93-790c-47aa-83f9-4e3b0706c097-kube-api-access-cp8kv\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.959167 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7b30cd41b71159196202343fc50806009d112b22c6feff0076217a3fe97221" Oct 02 12:36:16 crc kubenswrapper[4929]: I1002 12:36:16.959272 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:36:18 crc kubenswrapper[4929]: I1002 12:36:18.176192 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d27f93-790c-47aa-83f9-4e3b0706c097" path="/var/lib/kubelet/pods/21d27f93-790c-47aa-83f9-4e3b0706c097/volumes" Oct 02 12:36:44 crc kubenswrapper[4929]: I1002 12:36:44.737053 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:36:44 crc kubenswrapper[4929]: I1002 12:36:44.737866 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.136349 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 12:36:59 crc kubenswrapper[4929]: E1002 12:36:59.137287 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d27f93-790c-47aa-83f9-4e3b0706c097" containerName="mariadb-client" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.137302 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d27f93-790c-47aa-83f9-4e3b0706c097" containerName="mariadb-client" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.137466 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d27f93-790c-47aa-83f9-4e3b0706c097" containerName="mariadb-client" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.139545 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.142450 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.143452 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.145931 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.146719 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zzbmh" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.154555 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.156521 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.180228 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.182162 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.186243 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b72659fd-1823-4827-94a5-a39c03fa1479\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b72659fd-1823-4827-94a5-a39c03fa1479\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.186316 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9j6r\" (UniqueName: \"kubernetes.io/projected/2f945dd1-560f-4013-b22c-bb99be50b29d-kube-api-access-j9j6r\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.186375 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f945dd1-560f-4013-b22c-bb99be50b29d-config\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.186520 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f945dd1-560f-4013-b22c-bb99be50b29d-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.186555 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f945dd1-560f-4013-b22c-bb99be50b29d-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.186582 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f945dd1-560f-4013-b22c-bb99be50b29d-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.206238 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.212298 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.287945 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8859d23-8436-449b-874a-722a6bc44f5c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288007 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8859d23-8436-449b-874a-722a6bc44f5c-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288055 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-b72659fd-1823-4827-94a5-a39c03fa1479\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b72659fd-1823-4827-94a5-a39c03fa1479\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288077 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f68c411-ded7-456d-9c53-e763d401ec80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f68c411-ded7-456d-9c53-e763d401ec80\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288103 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9j6r\" (UniqueName: \"kubernetes.io/projected/2f945dd1-560f-4013-b22c-bb99be50b29d-kube-api-access-j9j6r\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288122 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f945dd1-560f-4013-b22c-bb99be50b29d-config\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288224 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8859d23-8436-449b-874a-722a6bc44f5c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288425 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8859d23-8436-449b-874a-722a6bc44f5c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288512 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f945dd1-560f-4013-b22c-bb99be50b29d-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288551 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f945dd1-560f-4013-b22c-bb99be50b29d-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288581 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flsm2\" (UniqueName: \"kubernetes.io/projected/e8859d23-8436-449b-874a-722a6bc44f5c-kube-api-access-flsm2\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288621 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2f945dd1-560f-4013-b22c-bb99be50b29d-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.288952 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f945dd1-560f-4013-b22c-bb99be50b29d-config\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.289040 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2f945dd1-560f-4013-b22c-bb99be50b29d-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.290087 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f945dd1-560f-4013-b22c-bb99be50b29d-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.291260 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.291285 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b72659fd-1823-4827-94a5-a39c03fa1479\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b72659fd-1823-4827-94a5-a39c03fa1479\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9185d7fc1aaa2ec46e19d062919bab94bc30cc23b6cde10a368618dc455790af/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.294847 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f945dd1-560f-4013-b22c-bb99be50b29d-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.313491 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9j6r\" (UniqueName: \"kubernetes.io/projected/2f945dd1-560f-4013-b22c-bb99be50b29d-kube-api-access-j9j6r\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.340640 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b72659fd-1823-4827-94a5-a39c03fa1479\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b72659fd-1823-4827-94a5-a39c03fa1479\") pod \"ovsdbserver-nb-2\" (UID: \"2f945dd1-560f-4013-b22c-bb99be50b29d\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.353286 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.365778 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.372736 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.373325 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.375646 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ln2kl" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.375690 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.386007 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.391087 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flsm2\" (UniqueName: \"kubernetes.io/projected/e8859d23-8436-449b-874a-722a6bc44f5c-kube-api-access-flsm2\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.391303 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8859d23-8436-449b-874a-722a6bc44f5c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.391352 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.391441 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8859d23-8436-449b-874a-722a6bc44f5c-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.391562 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.396572 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.397278 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.397322 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-config\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.397376 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f68c411-ded7-456d-9c53-e763d401ec80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f68c411-ded7-456d-9c53-e763d401ec80\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.397440 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-965kp\" (UniqueName: \"kubernetes.io/projected/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-kube-api-access-965kp\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.397471 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8859d23-8436-449b-874a-722a6bc44f5c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.397510 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.397627 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8859d23-8436-449b-874a-722a6bc44f5c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.398229 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8859d23-8436-449b-874a-722a6bc44f5c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.398300 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8859d23-8436-449b-874a-722a6bc44f5c-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.400544 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8859d23-8436-449b-874a-722a6bc44f5c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.402101 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8859d23-8436-449b-874a-722a6bc44f5c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.403158 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.404510 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.410167 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.410198 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f68c411-ded7-456d-9c53-e763d401ec80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f68c411-ded7-456d-9c53-e763d401ec80\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a38dba0948a61fe7e9169657d892748f8386d8776282e55e37973a94175737a/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.410611 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.412893 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flsm2\" (UniqueName: \"kubernetes.io/projected/e8859d23-8436-449b-874a-722a6bc44f5c-kube-api-access-flsm2\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.420206 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.443617 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f68c411-ded7-456d-9c53-e763d401ec80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f68c411-ded7-456d-9c53-e763d401ec80\") pod \"ovsdbserver-nb-0\" (UID: \"e8859d23-8436-449b-874a-722a6bc44f5c\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.471634 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.483524 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.499742 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.500763 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.500937 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb37575-cc7e-4f62-ab42-32c31388ed7d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501095 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-965kp\" (UniqueName: \"kubernetes.io/projected/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-kube-api-access-965kp\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501188 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bbb37575-cc7e-4f62-ab42-32c31388ed7d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501269 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c37826e0-0523-4e9c-a506-afc38262b985-config\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501350 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501398 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c37826e0-0523-4e9c-a506-afc38262b985-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501476 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/88692526-80f8-4a95-93d6-a6920288ddbf-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 
12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501631 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88692526-80f8-4a95-93d6-a6920288ddbf-config\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501719 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqbn\" (UniqueName: \"kubernetes.io/projected/c37826e0-0523-4e9c-a506-afc38262b985-kube-api-access-frqbn\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501911 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c37826e0-0523-4e9c-a506-afc38262b985-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.501985 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxjn\" (UniqueName: \"kubernetes.io/projected/bbb37575-cc7e-4f62-ab42-32c31388ed7d-kube-api-access-7wxjn\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502091 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bbb37575-cc7e-4f62-ab42-32c31388ed7d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502161 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502202 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88692526-80f8-4a95-93d6-a6920288ddbf-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502248 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c37826e0-0523-4e9c-a506-afc38262b985-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502293 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502346 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502388 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb37575-cc7e-4f62-ab42-32c31388ed7d-config\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502422 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88692526-80f8-4a95-93d6-a6920288ddbf-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502453 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbh4k\" (UniqueName: \"kubernetes.io/projected/88692526-80f8-4a95-93d6-a6920288ddbf-kube-api-access-tbh4k\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502503 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.502537 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-config\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.504167 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.505253 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.506107 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-config\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.508659 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " 
pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.508892 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.508946 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9e18f49d1739f7ca806f716b9608be9b570d499a4f0bde88e4b16e42f8b2b638/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.524492 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-965kp\" (UniqueName: \"kubernetes.io/projected/4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f-kube-api-access-965kp\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.539827 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e665c1c5-be3c-46d4-8da8-8c17104968d5\") pod \"ovsdbserver-nb-1\" (UID: \"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.605973 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bbb37575-cc7e-4f62-ab42-32c31388ed7d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606013 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c37826e0-0523-4e9c-a506-afc38262b985-config\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606034 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c37826e0-0523-4e9c-a506-afc38262b985-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606051 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/88692526-80f8-4a95-93d6-a6920288ddbf-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606076 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88692526-80f8-4a95-93d6-a6920288ddbf-config\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606093 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frqbn\" (UniqueName: 
\"kubernetes.io/projected/c37826e0-0523-4e9c-a506-afc38262b985-kube-api-access-frqbn\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606124 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c37826e0-0523-4e9c-a506-afc38262b985-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606164 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxjn\" (UniqueName: \"kubernetes.io/projected/bbb37575-cc7e-4f62-ab42-32c31388ed7d-kube-api-access-7wxjn\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606224 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bbb37575-cc7e-4f62-ab42-32c31388ed7d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606259 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88692526-80f8-4a95-93d6-a6920288ddbf-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606282 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c37826e0-0523-4e9c-a506-afc38262b985-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606308 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606329 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb37575-cc7e-4f62-ab42-32c31388ed7d-config\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606345 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88692526-80f8-4a95-93d6-a6920288ddbf-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606361 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbh4k\" (UniqueName: \"kubernetes.io/projected/88692526-80f8-4a95-93d6-a6920288ddbf-kube-api-access-tbh4k\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 
12:36:59.606385 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606410 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.606428 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb37575-cc7e-4f62-ab42-32c31388ed7d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.607951 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbb37575-cc7e-4f62-ab42-32c31388ed7d-config\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.608238 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bbb37575-cc7e-4f62-ab42-32c31388ed7d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.609036 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
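The csi_attacher.go:380 / operation_generator.go:580 pairs in this stretch show kubelet's two-phase CSI mount path: the kubevirt.io.hostpath-provisioner driver does not advertise the STAGE_UNSTAGE_VOLUME node capability, so the device-staging call (MountVolume.MountDevice, i.e. NodeStageVolume) is skipped as a no-op that only records the globalmount path, and the per-pod MountVolume.SetUp (NodePublishVolume) does the actual work. A minimal sketch of reconstructing that per-PVC sequence from journal lines like the ones above; the input filename and the regexes are illustrative assumptions, not anything the log itself defines:

    import re
    from collections import defaultdict

    # Illustrative patterns for the operationExecutor entries seen in this journal;
    # volume names appear backslash-quoted (\"pvc-...\") inside the klog message.
    PATTERNS = {
        "attach-verified": re.compile(r'VerifyControllerAttachedVolume started for volume \\"(pvc-[0-9a-f-]+)\\"'),
        "device-mounted":  re.compile(r'MountVolume\.MountDevice succeeded for volume \\"(pvc-[0-9a-f-]+)\\"'),
        "setup-done":      re.compile(r'MountVolume\.SetUp succeeded for volume \\"(pvc-[0-9a-f-]+)\\"'),
    }

    def trace_pvc_mounts(lines):
        """Return {pvc-name: [phase, ...]} in the order each phase was logged."""
        phases = defaultdict(list)
        for line in lines:
            for phase, pattern in PATTERNS.items():
                match = pattern.search(line)
                if match:
                    phases[match.group(1)].append(phase)
        return phases

    if __name__ == "__main__":
        # Assumed: the journal excerpt has been saved to this file, one entry per line.
        with open("kubelet.log") as f:
            for pvc, sequence in trace_pvc_mounts(f).items():
                print(pvc, "->", " -> ".join(sequence))

Run against this section, each ovsdbserver PVC should come out as attach-verified -> device-mounted -> setup-done, with no separate staging step in between.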
Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.609055 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/34aad44a2a796d4460e69cf89d30ef3e948e5adc97353d5e7466469635f3447e/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.609734 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/88692526-80f8-4a95-93d6-a6920288ddbf-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.609913 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c37826e0-0523-4e9c-a506-afc38262b985-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.609891 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c37826e0-0523-4e9c-a506-afc38262b985-config\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.610279 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88692526-80f8-4a95-93d6-a6920288ddbf-config\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.610595 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.610618 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dddfab2582dc710c90917cfa343c3cf692487e5e7edb18fc7b1505652668095f/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.610822 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88692526-80f8-4a95-93d6-a6920288ddbf-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.611790 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c37826e0-0523-4e9c-a506-afc38262b985-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.612123 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bbb37575-cc7e-4f62-ab42-32c31388ed7d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.613075 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.613104 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8589dc22404d1043805a0644d15d4dc2374b43235f8dc7f78cfeedf7df284d81/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.613283 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88692526-80f8-4a95-93d6-a6920288ddbf-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.621381 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c37826e0-0523-4e9c-a506-afc38262b985-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.623641 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb37575-cc7e-4f62-ab42-32c31388ed7d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.627034 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxjn\" (UniqueName: \"kubernetes.io/projected/bbb37575-cc7e-4f62-ab42-32c31388ed7d-kube-api-access-7wxjn\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.627743 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqbn\" (UniqueName: \"kubernetes.io/projected/c37826e0-0523-4e9c-a506-afc38262b985-kube-api-access-frqbn\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.632145 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbh4k\" (UniqueName: \"kubernetes.io/projected/88692526-80f8-4a95-93d6-a6920288ddbf-kube-api-access-tbh4k\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.639782 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-333f4d19-8498-4c8b-a0f4-8a1ba0bd5fdc\") pod \"ovsdbserver-sb-1\" (UID: \"c37826e0-0523-4e9c-a506-afc38262b985\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.647341 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15b67729-a5f4-4921-a0f4-33b90519b66a\") pod \"ovsdbserver-sb-2\" (UID: \"88692526-80f8-4a95-93d6-a6920288ddbf\") " 
pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.649523 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6244d43-9be1-4df4-848b-be75ba44d9e7\") pod \"ovsdbserver-sb-0\" (UID: \"bbb37575-cc7e-4f62-ab42-32c31388ed7d\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.698947 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.770468 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.778150 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.805911 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 02 12:36:59 crc kubenswrapper[4929]: I1002 12:36:59.974796 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 12:37:00 crc kubenswrapper[4929]: I1002 12:37:00.073665 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 02 12:37:00 crc kubenswrapper[4929]: W1002 12:37:00.090651 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f945dd1_560f_4013_b22c_bb99be50b29d.slice/crio-a0d3870bdb0509aa9c83a524ae680f5acc41b84141a37d6af46228f95742f7ac WatchSource:0}: Error finding container a0d3870bdb0509aa9c83a524ae680f5acc41b84141a37d6af46228f95742f7ac: Status 404 returned error can't find the container with id a0d3870bdb0509aa9c83a524ae680f5acc41b84141a37d6af46228f95742f7ac Oct 02 12:37:00 crc kubenswrapper[4929]: I1002 12:37:00.314223 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 02 12:37:00 crc kubenswrapper[4929]: I1002 12:37:00.327688 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8859d23-8436-449b-874a-722a6bc44f5c","Type":"ContainerStarted","Data":"01e115a87459eaf52d6a51f553964cc73255ef3c02afdbe0c129e2edfa53a266"} Oct 02 12:37:00 crc kubenswrapper[4929]: I1002 12:37:00.348019 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"2f945dd1-560f-4013-b22c-bb99be50b29d","Type":"ContainerStarted","Data":"a0d3870bdb0509aa9c83a524ae680f5acc41b84141a37d6af46228f95742f7ac"} Oct 02 12:37:00 crc kubenswrapper[4929]: I1002 12:37:00.418774 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 02 12:37:00 crc kubenswrapper[4929]: W1002 12:37:00.426995 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab1b12f_93dd_4a7b_9f77_ebbd7b53f32f.slice/crio-88ca6fe1e1231552e475358534f610fbe58734cd4eb700d68f9c2e789d224581 WatchSource:0}: Error finding container 88ca6fe1e1231552e475358534f610fbe58734cd4eb700d68f9c2e789d224581: Status 404 returned error can't find the container with id 88ca6fe1e1231552e475358534f610fbe58734cd4eb700d68f9c2e789d224581 Oct 02 12:37:00 crc kubenswrapper[4929]: I1002 12:37:00.614545 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Oct 02 12:37:00 crc kubenswrapper[4929]: W1002 12:37:00.622721 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb37575_cc7e_4f62_ab42_32c31388ed7d.slice/crio-3d92cef1d4412c74ced91757fa5b33c959a94df5f920277d2c0f2d47b07170bb WatchSource:0}: Error finding container 3d92cef1d4412c74ced91757fa5b33c959a94df5f920277d2c0f2d47b07170bb: Status 404 returned error can't find the container with id 3d92cef1d4412c74ced91757fa5b33c959a94df5f920277d2c0f2d47b07170bb Oct 02 12:37:00 crc kubenswrapper[4929]: I1002 12:37:00.964624 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 02 12:37:00 crc kubenswrapper[4929]: W1002 12:37:00.974615 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88692526_80f8_4a95_93d6_a6920288ddbf.slice/crio-bd3e7f4a37cab909cd6c315a24a15feda28478d72e09538e9d289f089f449681 WatchSource:0}: Error finding container bd3e7f4a37cab909cd6c315a24a15feda28478d72e09538e9d289f089f449681: Status 404 returned error can't find the container with id bd3e7f4a37cab909cd6c315a24a15feda28478d72e09538e9d289f089f449681 Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.369263 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bbb37575-cc7e-4f62-ab42-32c31388ed7d","Type":"ContainerStarted","Data":"89a601192a7af2aaf338f44f96db3ddbcb6a975ba63a495d9992647d2b9f1f9c"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.369783 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bbb37575-cc7e-4f62-ab42-32c31388ed7d","Type":"ContainerStarted","Data":"3d92cef1d4412c74ced91757fa5b33c959a94df5f920277d2c0f2d47b07170bb"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.372517 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c37826e0-0523-4e9c-a506-afc38262b985","Type":"ContainerStarted","Data":"bd794f548bd2b3c8f512f578414cdf945de9ff821c74f7d130122b0c96e93bbb"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.372557 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c37826e0-0523-4e9c-a506-afc38262b985","Type":"ContainerStarted","Data":"961e1d0103f215120c3a5d038c4fc1f7ba899b245c60b8c3505116359a74cb8a"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.372576 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c37826e0-0523-4e9c-a506-afc38262b985","Type":"ContainerStarted","Data":"652f73f3b399de998ac29f70a65ac38ab370a91dc5f723842f350510f0b94919"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.375719 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8859d23-8436-449b-874a-722a6bc44f5c","Type":"ContainerStarted","Data":"e04073b3304570618a1e8c40667ee2aff5a76e7504c0a06e867e64b9bcd97206"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.375789 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8859d23-8436-449b-874a-722a6bc44f5c","Type":"ContainerStarted","Data":"3400bd1abf5d93f69bd740ce8b28daf35d62cc29ce76b79e8f362e38600f602e"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.377518 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"88692526-80f8-4a95-93d6-a6920288ddbf","Type":"ContainerStarted","Data":"bd3e7f4a37cab909cd6c315a24a15feda28478d72e09538e9d289f089f449681"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.380618 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"2f945dd1-560f-4013-b22c-bb99be50b29d","Type":"ContainerStarted","Data":"b654fe52eb28c83cb95c951517aea69a2e273eb59bbe2adf31c84e856e36b960"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.380649 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"2f945dd1-560f-4013-b22c-bb99be50b29d","Type":"ContainerStarted","Data":"c0e64e3d41f10e1297d56eba3535fec8d3af160dc27339324360c89e2969bc9e"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.382688 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f","Type":"ContainerStarted","Data":"dea77a25cc51f86b8d5bf3c9a76d513e566a85922699a44b818f090b06f044a4"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.382723 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f","Type":"ContainerStarted","Data":"aafc566d0b9a0f46e2d5bd483bdf911499b99a4f227e0042568c4d56773d5399"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.382736 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f","Type":"ContainerStarted","Data":"88ca6fe1e1231552e475358534f610fbe58734cd4eb700d68f9c2e789d224581"} Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.399521 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.3994939410000002 podStartE2EDuration="3.399493941s" podCreationTimestamp="2025-10-02 12:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:01.394130446 +0000 UTC m=+5221.944496820" watchObservedRunningTime="2025-10-02 12:37:01.399493941 +0000 UTC m=+5221.949860315" Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.424116 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.424085201 podStartE2EDuration="3.424085201s" podCreationTimestamp="2025-10-02 12:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:01.416390399 +0000 UTC m=+5221.966756753" watchObservedRunningTime="2025-10-02 12:37:01.424085201 +0000 UTC m=+5221.974451575" Oct 02 12:37:01 crc kubenswrapper[4929]: I1002 12:37:01.447663 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.447641591 podStartE2EDuration="3.447641591s" podCreationTimestamp="2025-10-02 12:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:01.442667338 +0000 UTC m=+5221.993033712" watchObservedRunningTime="2025-10-02 12:37:01.447641591 +0000 UTC m=+5221.998007965" Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.394383 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"88692526-80f8-4a95-93d6-a6920288ddbf","Type":"ContainerStarted","Data":"ccdc8349c5aa7041b99f4b805482e402ad5ad7b29487f6876efe71b0df520964"} Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.394758 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"88692526-80f8-4a95-93d6-a6920288ddbf","Type":"ContainerStarted","Data":"86bb5d4559ced4c68aebb7ce0a170f5a76d258edf4ad6f16f384d58abd850172"} Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.396871 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bbb37575-cc7e-4f62-ab42-32c31388ed7d","Type":"ContainerStarted","Data":"6a7f23fee73eed528b5fcafb63e8deffe7448198280e63e69c5928cfe9a08a19"} Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.422072 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.422048838 podStartE2EDuration="4.422048838s" podCreationTimestamp="2025-10-02 12:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:01.467535316 +0000 UTC m=+5222.017901710" watchObservedRunningTime="2025-10-02 12:37:02.422048838 +0000 UTC m=+5222.972415202" Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.423360 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.423349546 podStartE2EDuration="4.423349546s" podCreationTimestamp="2025-10-02 12:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:02.416527479 +0000 UTC m=+5222.966893923" watchObservedRunningTime="2025-10-02 12:37:02.423349546 +0000 UTC m=+5222.973715910" Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.471681 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.483608 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.699939 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.771089 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 02 12:37:02 crc kubenswrapper[4929]: I1002 12:37:02.806869 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 02 12:37:03 crc kubenswrapper[4929]: I1002 12:37:03.426212 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=5.426194584 podStartE2EDuration="5.426194584s" podCreationTimestamp="2025-10-02 12:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:03.419506771 +0000 UTC m=+5223.969873135" watchObservedRunningTime="2025-10-02 12:37:03.426194584 +0000 UTC m=+5223.976560948" Oct 02 12:37:04 crc kubenswrapper[4929]: I1002 12:37:04.472455 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 12:37:04 crc kubenswrapper[4929]: I1002 12:37:04.483751 4929 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 02 12:37:04 crc kubenswrapper[4929]: I1002 12:37:04.700184 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 02 12:37:04 crc kubenswrapper[4929]: I1002 12:37:04.771433 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 02 12:37:04 crc kubenswrapper[4929]: I1002 12:37:04.779379 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 02 12:37:04 crc kubenswrapper[4929]: I1002 12:37:04.807038 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.537417 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.544325 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.581493 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.593181 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.741448 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.772986 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d9f779dd5-wmrfl"] Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.774263 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.777858 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.785114 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.791110 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9f779dd5-wmrfl"] Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.823629 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-dns-svc\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.827312 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-config\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.827731 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-ovsdbserver-nb\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.827974 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mswkz\" (UniqueName: \"kubernetes.io/projected/30508385-dfea-4fa9-ad3c-3f595d9ace77-kube-api-access-mswkz\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.843139 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.864286 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.875253 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.897852 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.924977 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.929439 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-dns-svc\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.929488 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-config\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.929509 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-ovsdbserver-nb\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.929556 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mswkz\" (UniqueName: \"kubernetes.io/projected/30508385-dfea-4fa9-ad3c-3f595d9ace77-kube-api-access-mswkz\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.930595 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-dns-svc\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.930708 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-config\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.930906 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-ovsdbserver-nb\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:05 crc kubenswrapper[4929]: I1002 12:37:05.951366 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mswkz\" (UniqueName: \"kubernetes.io/projected/30508385-dfea-4fa9-ad3c-3f595d9ace77-kube-api-access-mswkz\") pod \"dnsmasq-dns-d9f779dd5-wmrfl\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.116162 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.272131 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9f779dd5-wmrfl"] Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.291262 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d8c66857-xtjcc"] Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.297907 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.302600 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.323713 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8c66857-xtjcc"] Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.440412 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-sb\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.440479 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xt9s\" (UniqueName: \"kubernetes.io/projected/4c854aa2-1911-4558-a3cc-0b7f2954a16e-kube-api-access-5xt9s\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.440520 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-config\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.440546 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-nb\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.440607 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-dns-svc\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.526675 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.537637 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.541769 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-sb\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.542084 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xt9s\" (UniqueName: \"kubernetes.io/projected/4c854aa2-1911-4558-a3cc-0b7f2954a16e-kube-api-access-5xt9s\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc 
kubenswrapper[4929]: I1002 12:37:06.542138 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-config\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.542176 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-nb\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.542431 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-dns-svc\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.544383 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-sb\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.544800 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-dns-svc\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.547143 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-config\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.547388 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-nb\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.595638 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xt9s\" (UniqueName: \"kubernetes.io/projected/4c854aa2-1911-4558-a3cc-0b7f2954a16e-kube-api-access-5xt9s\") pod \"dnsmasq-dns-d8c66857-xtjcc\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.650676 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:06 crc kubenswrapper[4929]: I1002 12:37:06.687362 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9f779dd5-wmrfl"] Oct 02 12:37:06 crc kubenswrapper[4929]: W1002 12:37:06.706873 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30508385_dfea_4fa9_ad3c_3f595d9ace77.slice/crio-6c03f1064212d364d3020ed32604124a3f7a37189777f47f6134e1f134c03655 WatchSource:0}: Error finding container 6c03f1064212d364d3020ed32604124a3f7a37189777f47f6134e1f134c03655: Status 404 returned error can't find the container with id 6c03f1064212d364d3020ed32604124a3f7a37189777f47f6134e1f134c03655 Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.149611 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8c66857-xtjcc"] Oct 02 12:37:07 crc kubenswrapper[4929]: W1002 12:37:07.158386 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c854aa2_1911_4558_a3cc_0b7f2954a16e.slice/crio-52a9f227890fbda96392d07c0c146826a5be2f13d8d985a18f64bc364dfb20fb WatchSource:0}: Error finding container 52a9f227890fbda96392d07c0c146826a5be2f13d8d985a18f64bc364dfb20fb: Status 404 returned error can't find the container with id 52a9f227890fbda96392d07c0c146826a5be2f13d8d985a18f64bc364dfb20fb Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.452821 4929 generic.go:334] "Generic (PLEG): container finished" podID="30508385-dfea-4fa9-ad3c-3f595d9ace77" containerID="58c89004689f3f8286cb51fa0982a4cdf5965349031d86b2d2f80306d5d9108d" exitCode=0 Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.452876 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" event={"ID":"30508385-dfea-4fa9-ad3c-3f595d9ace77","Type":"ContainerDied","Data":"58c89004689f3f8286cb51fa0982a4cdf5965349031d86b2d2f80306d5d9108d"} Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.453009 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" event={"ID":"30508385-dfea-4fa9-ad3c-3f595d9ace77","Type":"ContainerStarted","Data":"6c03f1064212d364d3020ed32604124a3f7a37189777f47f6134e1f134c03655"} Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.459034 4929 generic.go:334] "Generic (PLEG): container finished" podID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerID="13c4330041e02e2d3710ba5e08cebadcfb218ad87466918f5702c4a5ab534e97" exitCode=0 Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.459328 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" event={"ID":"4c854aa2-1911-4558-a3cc-0b7f2954a16e","Type":"ContainerDied","Data":"13c4330041e02e2d3710ba5e08cebadcfb218ad87466918f5702c4a5ab534e97"} Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.459358 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" event={"ID":"4c854aa2-1911-4558-a3cc-0b7f2954a16e","Type":"ContainerStarted","Data":"52a9f227890fbda96392d07c0c146826a5be2f13d8d985a18f64bc364dfb20fb"} Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.720444 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.768567 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mswkz\" (UniqueName: \"kubernetes.io/projected/30508385-dfea-4fa9-ad3c-3f595d9ace77-kube-api-access-mswkz\") pod \"30508385-dfea-4fa9-ad3c-3f595d9ace77\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.768631 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-config\") pod \"30508385-dfea-4fa9-ad3c-3f595d9ace77\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.768676 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-ovsdbserver-nb\") pod \"30508385-dfea-4fa9-ad3c-3f595d9ace77\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.768795 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-dns-svc\") pod \"30508385-dfea-4fa9-ad3c-3f595d9ace77\" (UID: \"30508385-dfea-4fa9-ad3c-3f595d9ace77\") " Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.773391 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30508385-dfea-4fa9-ad3c-3f595d9ace77-kube-api-access-mswkz" (OuterVolumeSpecName: "kube-api-access-mswkz") pod "30508385-dfea-4fa9-ad3c-3f595d9ace77" (UID: "30508385-dfea-4fa9-ad3c-3f595d9ace77"). InnerVolumeSpecName "kube-api-access-mswkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.788295 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30508385-dfea-4fa9-ad3c-3f595d9ace77" (UID: "30508385-dfea-4fa9-ad3c-3f595d9ace77"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.789339 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-config" (OuterVolumeSpecName: "config") pod "30508385-dfea-4fa9-ad3c-3f595d9ace77" (UID: "30508385-dfea-4fa9-ad3c-3f595d9ace77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.793863 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30508385-dfea-4fa9-ad3c-3f595d9ace77" (UID: "30508385-dfea-4fa9-ad3c-3f595d9ace77"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.870598 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mswkz\" (UniqueName: \"kubernetes.io/projected/30508385-dfea-4fa9-ad3c-3f595d9ace77-kube-api-access-mswkz\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.870639 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.870650 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:07 crc kubenswrapper[4929]: I1002 12:37:07.870660 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30508385-dfea-4fa9-ad3c-3f595d9ace77-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.468755 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" event={"ID":"4c854aa2-1911-4558-a3cc-0b7f2954a16e","Type":"ContainerStarted","Data":"70dfc92cff4800037a88feb3dd0f80a1ac32805dd1374353bb74f71d693bc5c7"} Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.468911 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.470390 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" event={"ID":"30508385-dfea-4fa9-ad3c-3f595d9ace77","Type":"ContainerDied","Data":"6c03f1064212d364d3020ed32604124a3f7a37189777f47f6134e1f134c03655"} Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.470421 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d9f779dd5-wmrfl" Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.470457 4929 scope.go:117] "RemoveContainer" containerID="58c89004689f3f8286cb51fa0982a4cdf5965349031d86b2d2f80306d5d9108d" Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.496124 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" podStartSLOduration=2.496103833 podStartE2EDuration="2.496103833s" podCreationTimestamp="2025-10-02 12:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:08.493162648 +0000 UTC m=+5229.043529012" watchObservedRunningTime="2025-10-02 12:37:08.496103833 +0000 UTC m=+5229.046470197" Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.531590 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9f779dd5-wmrfl"] Oct 02 12:37:08 crc kubenswrapper[4929]: I1002 12:37:08.536939 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d9f779dd5-wmrfl"] Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.202201 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 02 12:37:09 crc kubenswrapper[4929]: E1002 12:37:09.202913 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30508385-dfea-4fa9-ad3c-3f595d9ace77" containerName="init" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.202931 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="30508385-dfea-4fa9-ad3c-3f595d9ace77" containerName="init" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.203212 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="30508385-dfea-4fa9-ad3c-3f595d9ace77" containerName="init" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.203889 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.210978 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.211499 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.300241 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/598bbc9b-7eee-41ca-9078-7edd3464e2f9-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.300510 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56ws\" (UniqueName: \"kubernetes.io/projected/598bbc9b-7eee-41ca-9078-7edd3464e2f9-kube-api-access-j56ws\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.300670 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-223d51a7-9b46-46d0-91db-be146b2741a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.402232 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-223d51a7-9b46-46d0-91db-be146b2741a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.402287 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/598bbc9b-7eee-41ca-9078-7edd3464e2f9-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.402350 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56ws\" (UniqueName: \"kubernetes.io/projected/598bbc9b-7eee-41ca-9078-7edd3464e2f9-kube-api-access-j56ws\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.406307 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.406338 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-223d51a7-9b46-46d0-91db-be146b2741a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f60f53905953d6d2f76d2e29e91e419f4a0e472450bf1ef099ba11cd18ff145b/globalmount\"" pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.414829 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/598bbc9b-7eee-41ca-9078-7edd3464e2f9-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.422728 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56ws\" (UniqueName: \"kubernetes.io/projected/598bbc9b-7eee-41ca-9078-7edd3464e2f9-kube-api-access-j56ws\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.445850 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-223d51a7-9b46-46d0-91db-be146b2741a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9\") pod \"ovn-copy-data\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.522995 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 02 12:37:09 crc kubenswrapper[4929]: I1002 12:37:09.983400 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 02 12:37:10 crc kubenswrapper[4929]: I1002 12:37:10.166661 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30508385-dfea-4fa9-ad3c-3f595d9ace77" path="/var/lib/kubelet/pods/30508385-dfea-4fa9-ad3c-3f595d9ace77/volumes" Oct 02 12:37:10 crc kubenswrapper[4929]: I1002 12:37:10.492846 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"598bbc9b-7eee-41ca-9078-7edd3464e2f9","Type":"ContainerStarted","Data":"46f5c01f67a15f7f058e412b7479965a4dcd2ef2221ebbf3be3e34db9c64577d"} Oct 02 12:37:10 crc kubenswrapper[4929]: I1002 12:37:10.492922 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"598bbc9b-7eee-41ca-9078-7edd3464e2f9","Type":"ContainerStarted","Data":"045bc22596b7e9773f74b1632387bdb6d0f6dc0e8a8baf0f82209f7dd291aeea"} Oct 02 12:37:10 crc kubenswrapper[4929]: I1002 12:37:10.514607 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.514588069 podStartE2EDuration="2.514588069s" podCreationTimestamp="2025-10-02 12:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:10.50839578 +0000 UTC m=+5231.058762154" watchObservedRunningTime="2025-10-02 12:37:10.514588069 +0000 UTC m=+5231.064954433" Oct 02 12:37:14 crc kubenswrapper[4929]: I1002 12:37:14.736884 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:37:14 crc kubenswrapper[4929]: I1002 12:37:14.737837 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.152722 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.154319 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.157193 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.157294 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.157447 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kvzhg" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.170377 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.201243 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6a797f-1ea1-430f-8898-c10fa43d597f-config\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.201625 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f6a797f-1ea1-430f-8898-c10fa43d597f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.201734 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f6a797f-1ea1-430f-8898-c10fa43d597f-scripts\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.201872 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6a797f-1ea1-430f-8898-c10fa43d597f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.201973 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfmml\" (UniqueName: \"kubernetes.io/projected/4f6a797f-1ea1-430f-8898-c10fa43d597f-kube-api-access-zfmml\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.303148 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6a797f-1ea1-430f-8898-c10fa43d597f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.303232 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfmml\" (UniqueName: \"kubernetes.io/projected/4f6a797f-1ea1-430f-8898-c10fa43d597f-kube-api-access-zfmml\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.303299 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6a797f-1ea1-430f-8898-c10fa43d597f-config\") pod 
\"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.303360 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f6a797f-1ea1-430f-8898-c10fa43d597f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.303390 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f6a797f-1ea1-430f-8898-c10fa43d597f-scripts\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.304390 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f6a797f-1ea1-430f-8898-c10fa43d597f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.304659 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f6a797f-1ea1-430f-8898-c10fa43d597f-config\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.304668 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f6a797f-1ea1-430f-8898-c10fa43d597f-scripts\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.311340 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6a797f-1ea1-430f-8898-c10fa43d597f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.326496 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfmml\" (UniqueName: \"kubernetes.io/projected/4f6a797f-1ea1-430f-8898-c10fa43d597f-kube-api-access-zfmml\") pod \"ovn-northd-0\" (UID: \"4f6a797f-1ea1-430f-8898-c10fa43d597f\") " pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.476507 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.616842 4929 scope.go:117] "RemoveContainer" containerID="eaee440912839b3b26d154dec2e2d1171744c2e24e658fd15370533bb6976416" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.655362 4929 scope.go:117] "RemoveContainer" containerID="e8fe31d0f1d66080d6e74715db9503181b2f6d36ac5a75fa94cbc8a84101b642" Oct 02 12:37:15 crc kubenswrapper[4929]: I1002 12:37:15.924835 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 12:37:15 crc kubenswrapper[4929]: W1002 12:37:15.929156 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f6a797f_1ea1_430f_8898_c10fa43d597f.slice/crio-9e3443ea2667f0d57fa18f443357a5a99f4aab11a9a4361057a286ead48ae22b WatchSource:0}: Error finding container 9e3443ea2667f0d57fa18f443357a5a99f4aab11a9a4361057a286ead48ae22b: Status 404 returned error can't find the container with id 9e3443ea2667f0d57fa18f443357a5a99f4aab11a9a4361057a286ead48ae22b Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.544257 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f6a797f-1ea1-430f-8898-c10fa43d597f","Type":"ContainerStarted","Data":"a1b50adc3416605d78c6c2e5f7706436ebf1f51e4c56b19f47b1e38624cd6b14"} Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.544584 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f6a797f-1ea1-430f-8898-c10fa43d597f","Type":"ContainerStarted","Data":"b96f9ae2102251e74d51ef5aaf2f63ffc1c83eb0c16e2759c31d9c638f57bc3a"} Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.544601 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.544611 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4f6a797f-1ea1-430f-8898-c10fa43d597f","Type":"ContainerStarted","Data":"9e3443ea2667f0d57fa18f443357a5a99f4aab11a9a4361057a286ead48ae22b"} Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.567296 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.567258545 podStartE2EDuration="1.567258545s" podCreationTimestamp="2025-10-02 12:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:16.558148242 +0000 UTC m=+5237.108514616" watchObservedRunningTime="2025-10-02 12:37:16.567258545 +0000 UTC m=+5237.117624909" Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.652123 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.704629 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vwmcj"] Oct 02 12:37:16 crc kubenswrapper[4929]: I1002 12:37:16.705097 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" podUID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerName="dnsmasq-dns" containerID="cri-o://266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8" gracePeriod=10 Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.142573 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.238498 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-dns-svc\") pod \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.238941 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctr5j\" (UniqueName: \"kubernetes.io/projected/b70b4c24-a40c-4f07-b87f-e129132e5f2e-kube-api-access-ctr5j\") pod \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.239216 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-config\") pod \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\" (UID: \"b70b4c24-a40c-4f07-b87f-e129132e5f2e\") " Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.245321 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b70b4c24-a40c-4f07-b87f-e129132e5f2e-kube-api-access-ctr5j" (OuterVolumeSpecName: "kube-api-access-ctr5j") pod "b70b4c24-a40c-4f07-b87f-e129132e5f2e" (UID: "b70b4c24-a40c-4f07-b87f-e129132e5f2e"). InnerVolumeSpecName "kube-api-access-ctr5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.282720 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b70b4c24-a40c-4f07-b87f-e129132e5f2e" (UID: "b70b4c24-a40c-4f07-b87f-e129132e5f2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.282761 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-config" (OuterVolumeSpecName: "config") pod "b70b4c24-a40c-4f07-b87f-e129132e5f2e" (UID: "b70b4c24-a40c-4f07-b87f-e129132e5f2e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.341528 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctr5j\" (UniqueName: \"kubernetes.io/projected/b70b4c24-a40c-4f07-b87f-e129132e5f2e-kube-api-access-ctr5j\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.341567 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.341580 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b70b4c24-a40c-4f07-b87f-e129132e5f2e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.555105 4929 generic.go:334] "Generic (PLEG): container finished" podID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerID="266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8" exitCode=0 Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.555154 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" event={"ID":"b70b4c24-a40c-4f07-b87f-e129132e5f2e","Type":"ContainerDied","Data":"266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8"} Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.555170 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.555200 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vwmcj" event={"ID":"b70b4c24-a40c-4f07-b87f-e129132e5f2e","Type":"ContainerDied","Data":"38b0d1c2886e10677d29ffe083e839e3447bf0be691984092689855716305ea7"} Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.555219 4929 scope.go:117] "RemoveContainer" containerID="266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.577469 4929 scope.go:117] "RemoveContainer" containerID="e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.586071 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vwmcj"] Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.595405 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vwmcj"] Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.602372 4929 scope.go:117] "RemoveContainer" containerID="266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8" Oct 02 12:37:17 crc kubenswrapper[4929]: E1002 12:37:17.602878 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8\": container with ID starting with 266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8 not found: ID does not exist" containerID="266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.602921 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8"} err="failed to get container status 
\"266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8\": rpc error: code = NotFound desc = could not find container \"266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8\": container with ID starting with 266fc373e69b4dfce14ab98951593fcbdde51fef6c30458ebd54fd261005fcf8 not found: ID does not exist" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.602950 4929 scope.go:117] "RemoveContainer" containerID="e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e" Oct 02 12:37:17 crc kubenswrapper[4929]: E1002 12:37:17.603436 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e\": container with ID starting with e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e not found: ID does not exist" containerID="e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e" Oct 02 12:37:17 crc kubenswrapper[4929]: I1002 12:37:17.603475 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e"} err="failed to get container status \"e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e\": rpc error: code = NotFound desc = could not find container \"e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e\": container with ID starting with e0569379b1d086e588a73334704854bab04517b74006e3a54d837d277b9a411e not found: ID does not exist" Oct 02 12:37:18 crc kubenswrapper[4929]: I1002 12:37:18.172649 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" path="/var/lib/kubelet/pods/b70b4c24-a40c-4f07-b87f-e129132e5f2e/volumes" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.085090 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-n7cpg"] Oct 02 12:37:20 crc kubenswrapper[4929]: E1002 12:37:20.085438 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerName="dnsmasq-dns" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.085451 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerName="dnsmasq-dns" Oct 02 12:37:20 crc kubenswrapper[4929]: E1002 12:37:20.085474 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerName="init" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.085480 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerName="init" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.085654 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70b4c24-a40c-4f07-b87f-e129132e5f2e" containerName="dnsmasq-dns" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.086211 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-n7cpg" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.095294 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n7cpg"] Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.184599 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shc7d\" (UniqueName: \"kubernetes.io/projected/0a1da2e6-cf33-4709-a04f-6749a089638b-kube-api-access-shc7d\") pod \"keystone-db-create-n7cpg\" (UID: \"0a1da2e6-cf33-4709-a04f-6749a089638b\") " pod="openstack/keystone-db-create-n7cpg" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.285747 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shc7d\" (UniqueName: \"kubernetes.io/projected/0a1da2e6-cf33-4709-a04f-6749a089638b-kube-api-access-shc7d\") pod \"keystone-db-create-n7cpg\" (UID: \"0a1da2e6-cf33-4709-a04f-6749a089638b\") " pod="openstack/keystone-db-create-n7cpg" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.309927 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shc7d\" (UniqueName: \"kubernetes.io/projected/0a1da2e6-cf33-4709-a04f-6749a089638b-kube-api-access-shc7d\") pod \"keystone-db-create-n7cpg\" (UID: \"0a1da2e6-cf33-4709-a04f-6749a089638b\") " pod="openstack/keystone-db-create-n7cpg" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.419792 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n7cpg" Oct 02 12:37:20 crc kubenswrapper[4929]: I1002 12:37:20.833927 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-n7cpg"] Oct 02 12:37:21 crc kubenswrapper[4929]: I1002 12:37:21.587031 4929 generic.go:334] "Generic (PLEG): container finished" podID="0a1da2e6-cf33-4709-a04f-6749a089638b" containerID="5ab0eaa73d287fd2f53824262aafaf7dd0296e0b449b7fc021a08fe107cce0ce" exitCode=0 Oct 02 12:37:21 crc kubenswrapper[4929]: I1002 12:37:21.587115 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n7cpg" event={"ID":"0a1da2e6-cf33-4709-a04f-6749a089638b","Type":"ContainerDied","Data":"5ab0eaa73d287fd2f53824262aafaf7dd0296e0b449b7fc021a08fe107cce0ce"} Oct 02 12:37:21 crc kubenswrapper[4929]: I1002 12:37:21.587405 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n7cpg" event={"ID":"0a1da2e6-cf33-4709-a04f-6749a089638b","Type":"ContainerStarted","Data":"379cf3d3a19d805a6c4a8f5c73a4bde3dc6f20245822c53896ca8ea6e7ef8076"} Oct 02 12:37:22 crc kubenswrapper[4929]: I1002 12:37:22.926083 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n7cpg" Oct 02 12:37:23 crc kubenswrapper[4929]: I1002 12:37:23.026827 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shc7d\" (UniqueName: \"kubernetes.io/projected/0a1da2e6-cf33-4709-a04f-6749a089638b-kube-api-access-shc7d\") pod \"0a1da2e6-cf33-4709-a04f-6749a089638b\" (UID: \"0a1da2e6-cf33-4709-a04f-6749a089638b\") " Oct 02 12:37:23 crc kubenswrapper[4929]: I1002 12:37:23.032990 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1da2e6-cf33-4709-a04f-6749a089638b-kube-api-access-shc7d" (OuterVolumeSpecName: "kube-api-access-shc7d") pod "0a1da2e6-cf33-4709-a04f-6749a089638b" (UID: "0a1da2e6-cf33-4709-a04f-6749a089638b"). 
InnerVolumeSpecName "kube-api-access-shc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:23 crc kubenswrapper[4929]: I1002 12:37:23.128689 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shc7d\" (UniqueName: \"kubernetes.io/projected/0a1da2e6-cf33-4709-a04f-6749a089638b-kube-api-access-shc7d\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:23 crc kubenswrapper[4929]: I1002 12:37:23.603975 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-n7cpg" event={"ID":"0a1da2e6-cf33-4709-a04f-6749a089638b","Type":"ContainerDied","Data":"379cf3d3a19d805a6c4a8f5c73a4bde3dc6f20245822c53896ca8ea6e7ef8076"} Oct 02 12:37:23 crc kubenswrapper[4929]: I1002 12:37:23.604018 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="379cf3d3a19d805a6c4a8f5c73a4bde3dc6f20245822c53896ca8ea6e7ef8076" Oct 02 12:37:23 crc kubenswrapper[4929]: I1002 12:37:23.604105 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-n7cpg" Oct 02 12:37:23 crc kubenswrapper[4929]: E1002 12:37:23.718397 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a1da2e6_cf33_4709_a04f_6749a089638b.slice/crio-379cf3d3a19d805a6c4a8f5c73a4bde3dc6f20245822c53896ca8ea6e7ef8076\": RecentStats: unable to find data in memory cache]" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.180510 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bf22-account-create-thfq5"] Oct 02 12:37:30 crc kubenswrapper[4929]: E1002 12:37:30.182107 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1da2e6-cf33-4709-a04f-6749a089638b" containerName="mariadb-database-create" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.182121 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1da2e6-cf33-4709-a04f-6749a089638b" containerName="mariadb-database-create" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.182621 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a1da2e6-cf33-4709-a04f-6749a089638b" containerName="mariadb-database-create" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.183617 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bf22-account-create-thfq5" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.186104 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.200742 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bf22-account-create-thfq5"] Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.351967 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdpt5\" (UniqueName: \"kubernetes.io/projected/877dbf06-5427-494f-bc8a-529830e2180e-kube-api-access-gdpt5\") pod \"keystone-bf22-account-create-thfq5\" (UID: \"877dbf06-5427-494f-bc8a-529830e2180e\") " pod="openstack/keystone-bf22-account-create-thfq5" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.453406 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdpt5\" (UniqueName: \"kubernetes.io/projected/877dbf06-5427-494f-bc8a-529830e2180e-kube-api-access-gdpt5\") pod \"keystone-bf22-account-create-thfq5\" (UID: \"877dbf06-5427-494f-bc8a-529830e2180e\") " pod="openstack/keystone-bf22-account-create-thfq5" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.474599 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdpt5\" (UniqueName: \"kubernetes.io/projected/877dbf06-5427-494f-bc8a-529830e2180e-kube-api-access-gdpt5\") pod \"keystone-bf22-account-create-thfq5\" (UID: \"877dbf06-5427-494f-bc8a-529830e2180e\") " pod="openstack/keystone-bf22-account-create-thfq5" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.509345 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bf22-account-create-thfq5" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.537820 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 12:37:30 crc kubenswrapper[4929]: I1002 12:37:30.950477 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bf22-account-create-thfq5"] Oct 02 12:37:30 crc kubenswrapper[4929]: W1002 12:37:30.953713 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877dbf06_5427_494f_bc8a_529830e2180e.slice/crio-3ae58bf21e7236517fc3dcb1aac3bd160cf589f5f1caa8d92281ee88a96500bb WatchSource:0}: Error finding container 3ae58bf21e7236517fc3dcb1aac3bd160cf589f5f1caa8d92281ee88a96500bb: Status 404 returned error can't find the container with id 3ae58bf21e7236517fc3dcb1aac3bd160cf589f5f1caa8d92281ee88a96500bb Oct 02 12:37:31 crc kubenswrapper[4929]: I1002 12:37:31.673660 4929 generic.go:334] "Generic (PLEG): container finished" podID="877dbf06-5427-494f-bc8a-529830e2180e" containerID="8f7d43c71d26b46dbe717daba829cf49f424f8e3d3843e4840dcc24fea3e2375" exitCode=0 Oct 02 12:37:31 crc kubenswrapper[4929]: I1002 12:37:31.673726 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf22-account-create-thfq5" event={"ID":"877dbf06-5427-494f-bc8a-529830e2180e","Type":"ContainerDied","Data":"8f7d43c71d26b46dbe717daba829cf49f424f8e3d3843e4840dcc24fea3e2375"} Oct 02 12:37:31 crc kubenswrapper[4929]: I1002 12:37:31.674046 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf22-account-create-thfq5" event={"ID":"877dbf06-5427-494f-bc8a-529830e2180e","Type":"ContainerStarted","Data":"3ae58bf21e7236517fc3dcb1aac3bd160cf589f5f1caa8d92281ee88a96500bb"} Oct 02 12:37:33 crc kubenswrapper[4929]: I1002 12:37:33.014228 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bf22-account-create-thfq5" Oct 02 12:37:33 crc kubenswrapper[4929]: I1002 12:37:33.198828 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdpt5\" (UniqueName: \"kubernetes.io/projected/877dbf06-5427-494f-bc8a-529830e2180e-kube-api-access-gdpt5\") pod \"877dbf06-5427-494f-bc8a-529830e2180e\" (UID: \"877dbf06-5427-494f-bc8a-529830e2180e\") " Oct 02 12:37:33 crc kubenswrapper[4929]: I1002 12:37:33.203535 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877dbf06-5427-494f-bc8a-529830e2180e-kube-api-access-gdpt5" (OuterVolumeSpecName: "kube-api-access-gdpt5") pod "877dbf06-5427-494f-bc8a-529830e2180e" (UID: "877dbf06-5427-494f-bc8a-529830e2180e"). InnerVolumeSpecName "kube-api-access-gdpt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:33 crc kubenswrapper[4929]: I1002 12:37:33.300984 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdpt5\" (UniqueName: \"kubernetes.io/projected/877dbf06-5427-494f-bc8a-529830e2180e-kube-api-access-gdpt5\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:33 crc kubenswrapper[4929]: I1002 12:37:33.692764 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf22-account-create-thfq5" event={"ID":"877dbf06-5427-494f-bc8a-529830e2180e","Type":"ContainerDied","Data":"3ae58bf21e7236517fc3dcb1aac3bd160cf589f5f1caa8d92281ee88a96500bb"} Oct 02 12:37:33 crc kubenswrapper[4929]: I1002 12:37:33.692810 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bf22-account-create-thfq5" Oct 02 12:37:33 crc kubenswrapper[4929]: I1002 12:37:33.692821 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ae58bf21e7236517fc3dcb1aac3bd160cf589f5f1caa8d92281ee88a96500bb" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.638286 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-h69g2"] Oct 02 12:37:35 crc kubenswrapper[4929]: E1002 12:37:35.639130 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877dbf06-5427-494f-bc8a-529830e2180e" containerName="mariadb-account-create" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.639149 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="877dbf06-5427-494f-bc8a-529830e2180e" containerName="mariadb-account-create" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.639373 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="877dbf06-5427-494f-bc8a-529830e2180e" containerName="mariadb-account-create" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.640050 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.642122 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.642388 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.642542 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.647919 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mwwg9" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.654065 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h69g2"] Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.740597 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-combined-ca-bundle\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.741025 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6876\" (UniqueName: \"kubernetes.io/projected/8c8670e7-a468-4168-97fd-5bf24b827746-kube-api-access-c6876\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.741250 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-config-data\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.850051 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6876\" (UniqueName: \"kubernetes.io/projected/8c8670e7-a468-4168-97fd-5bf24b827746-kube-api-access-c6876\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.850159 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-config-data\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.850349 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-combined-ca-bundle\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.861264 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-config-data\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " 
pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.863440 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-combined-ca-bundle\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.880033 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6876\" (UniqueName: \"kubernetes.io/projected/8c8670e7-a468-4168-97fd-5bf24b827746-kube-api-access-c6876\") pod \"keystone-db-sync-h69g2\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:35 crc kubenswrapper[4929]: I1002 12:37:35.957531 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:36 crc kubenswrapper[4929]: I1002 12:37:36.429358 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h69g2"] Oct 02 12:37:36 crc kubenswrapper[4929]: I1002 12:37:36.718512 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h69g2" event={"ID":"8c8670e7-a468-4168-97fd-5bf24b827746","Type":"ContainerStarted","Data":"b9503834e331999ded2167675da7403e8f1625e1044dbdbf60123c822b59455d"} Oct 02 12:37:36 crc kubenswrapper[4929]: I1002 12:37:36.718801 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h69g2" event={"ID":"8c8670e7-a468-4168-97fd-5bf24b827746","Type":"ContainerStarted","Data":"8f6db11b685feb0640ede4e7a712609ab66cf4bc0547c70c6f2087f6cbfac8d8"} Oct 02 12:37:36 crc kubenswrapper[4929]: I1002 12:37:36.739224 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-h69g2" podStartSLOduration=1.739208425 podStartE2EDuration="1.739208425s" podCreationTimestamp="2025-10-02 12:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:36.736482507 +0000 UTC m=+5257.286848871" watchObservedRunningTime="2025-10-02 12:37:36.739208425 +0000 UTC m=+5257.289574789" Oct 02 12:37:38 crc kubenswrapper[4929]: I1002 12:37:38.736500 4929 generic.go:334] "Generic (PLEG): container finished" podID="8c8670e7-a468-4168-97fd-5bf24b827746" containerID="b9503834e331999ded2167675da7403e8f1625e1044dbdbf60123c822b59455d" exitCode=0 Oct 02 12:37:38 crc kubenswrapper[4929]: I1002 12:37:38.736605 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h69g2" event={"ID":"8c8670e7-a468-4168-97fd-5bf24b827746","Type":"ContainerDied","Data":"b9503834e331999ded2167675da7403e8f1625e1044dbdbf60123c822b59455d"} Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.054809 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.238980 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-combined-ca-bundle\") pod \"8c8670e7-a468-4168-97fd-5bf24b827746\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.239049 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-config-data\") pod \"8c8670e7-a468-4168-97fd-5bf24b827746\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.239078 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6876\" (UniqueName: \"kubernetes.io/projected/8c8670e7-a468-4168-97fd-5bf24b827746-kube-api-access-c6876\") pod \"8c8670e7-a468-4168-97fd-5bf24b827746\" (UID: \"8c8670e7-a468-4168-97fd-5bf24b827746\") " Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.250259 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8670e7-a468-4168-97fd-5bf24b827746-kube-api-access-c6876" (OuterVolumeSpecName: "kube-api-access-c6876") pod "8c8670e7-a468-4168-97fd-5bf24b827746" (UID: "8c8670e7-a468-4168-97fd-5bf24b827746"). InnerVolumeSpecName "kube-api-access-c6876". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.260917 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c8670e7-a468-4168-97fd-5bf24b827746" (UID: "8c8670e7-a468-4168-97fd-5bf24b827746"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.275906 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-config-data" (OuterVolumeSpecName: "config-data") pod "8c8670e7-a468-4168-97fd-5bf24b827746" (UID: "8c8670e7-a468-4168-97fd-5bf24b827746"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.340470 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.340503 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8670e7-a468-4168-97fd-5bf24b827746-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.340516 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6876\" (UniqueName: \"kubernetes.io/projected/8c8670e7-a468-4168-97fd-5bf24b827746-kube-api-access-c6876\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.752736 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h69g2" event={"ID":"8c8670e7-a468-4168-97fd-5bf24b827746","Type":"ContainerDied","Data":"8f6db11b685feb0640ede4e7a712609ab66cf4bc0547c70c6f2087f6cbfac8d8"} Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.752781 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f6db11b685feb0640ede4e7a712609ab66cf4bc0547c70c6f2087f6cbfac8d8" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.752823 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h69g2" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.970537 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jb5mh"] Oct 02 12:37:40 crc kubenswrapper[4929]: E1002 12:37:40.971606 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8670e7-a468-4168-97fd-5bf24b827746" containerName="keystone-db-sync" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.971634 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8670e7-a468-4168-97fd-5bf24b827746" containerName="keystone-db-sync" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.995354 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8670e7-a468-4168-97fd-5bf24b827746" containerName="keystone-db-sync" Oct 02 12:37:40 crc kubenswrapper[4929]: I1002 12:37:40.996593 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.008360 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mwwg9" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.008743 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.008837 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.008859 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.017502 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jb5mh"] Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.031421 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9fc44c7cc-h4xmp"] Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.032860 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.039150 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fc44c7cc-h4xmp"] Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157014 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-combined-ca-bundle\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157052 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157173 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-config-data\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157207 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjsr\" (UniqueName: \"kubernetes.io/projected/d52cea74-0895-4412-8a20-d0ac843411b8-kube-api-access-wrjsr\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157284 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-dns-svc\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157299 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-credential-keys\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157315 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h86cv\" (UniqueName: \"kubernetes.io/projected/aa5d218a-e650-4e97-bee7-e3f2d211606d-kube-api-access-h86cv\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157389 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-scripts\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157415 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-config\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157445 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.157469 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-fernet-keys\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258467 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-fernet-keys\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258515 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-combined-ca-bundle\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258535 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258591 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-config-data\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258616 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjsr\" (UniqueName: \"kubernetes.io/projected/d52cea74-0895-4412-8a20-d0ac843411b8-kube-api-access-wrjsr\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258658 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-dns-svc\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258675 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-credential-keys\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258688 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h86cv\" (UniqueName: \"kubernetes.io/projected/aa5d218a-e650-4e97-bee7-e3f2d211606d-kube-api-access-h86cv\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258736 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-scripts\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258755 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-config\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.258776 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.259675 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.260702 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-config\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: 
\"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.261428 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.261842 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-dns-svc\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.263340 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-credential-keys\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.263371 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-combined-ca-bundle\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.263875 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-scripts\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.276877 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-config-data\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.277397 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-fernet-keys\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.279257 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjsr\" (UniqueName: \"kubernetes.io/projected/d52cea74-0895-4412-8a20-d0ac843411b8-kube-api-access-wrjsr\") pod \"dnsmasq-dns-9fc44c7cc-h4xmp\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") " pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.285474 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h86cv\" (UniqueName: \"kubernetes.io/projected/aa5d218a-e650-4e97-bee7-e3f2d211606d-kube-api-access-h86cv\") pod \"keystone-bootstrap-jb5mh\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.335636 4929 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.352188 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.808214 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fc44c7cc-h4xmp"] Oct 02 12:37:41 crc kubenswrapper[4929]: I1002 12:37:41.815724 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jb5mh"] Oct 02 12:37:41 crc kubenswrapper[4929]: W1002 12:37:41.822884 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa5d218a_e650_4e97_bee7_e3f2d211606d.slice/crio-7ed8ebe120519ff3b452e2127dee56c0ff17c1a67271e392f665303a47648b58 WatchSource:0}: Error finding container 7ed8ebe120519ff3b452e2127dee56c0ff17c1a67271e392f665303a47648b58: Status 404 returned error can't find the container with id 7ed8ebe120519ff3b452e2127dee56c0ff17c1a67271e392f665303a47648b58 Oct 02 12:37:42 crc kubenswrapper[4929]: I1002 12:37:42.768139 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jb5mh" event={"ID":"aa5d218a-e650-4e97-bee7-e3f2d211606d","Type":"ContainerStarted","Data":"b738598602d618870e97f8380351e0ec26acef0127740d0fda55c76890576d9b"} Oct 02 12:37:42 crc kubenswrapper[4929]: I1002 12:37:42.768540 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jb5mh" event={"ID":"aa5d218a-e650-4e97-bee7-e3f2d211606d","Type":"ContainerStarted","Data":"7ed8ebe120519ff3b452e2127dee56c0ff17c1a67271e392f665303a47648b58"} Oct 02 12:37:42 crc kubenswrapper[4929]: I1002 12:37:42.771455 4929 generic.go:334] "Generic (PLEG): container finished" podID="d52cea74-0895-4412-8a20-d0ac843411b8" containerID="2455f56c6c5085ef02d2b8650b39d2e4e73d81e6e7a93cd6da4b383d38033817" exitCode=0 Oct 02 12:37:42 crc kubenswrapper[4929]: I1002 12:37:42.771514 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" event={"ID":"d52cea74-0895-4412-8a20-d0ac843411b8","Type":"ContainerDied","Data":"2455f56c6c5085ef02d2b8650b39d2e4e73d81e6e7a93cd6da4b383d38033817"} Oct 02 12:37:42 crc kubenswrapper[4929]: I1002 12:37:42.771548 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" event={"ID":"d52cea74-0895-4412-8a20-d0ac843411b8","Type":"ContainerStarted","Data":"91669054db15f729609ef052246e02a00d079020cb4ac3893e79ac3fc84975e7"} Oct 02 12:37:42 crc kubenswrapper[4929]: I1002 12:37:42.786009 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jb5mh" podStartSLOduration=2.785988182 podStartE2EDuration="2.785988182s" podCreationTimestamp="2025-10-02 12:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:42.784619052 +0000 UTC m=+5263.334985426" watchObservedRunningTime="2025-10-02 12:37:42.785988182 +0000 UTC m=+5263.336354546" Oct 02 12:37:43 crc kubenswrapper[4929]: I1002 12:37:43.779002 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" event={"ID":"d52cea74-0895-4412-8a20-d0ac843411b8","Type":"ContainerStarted","Data":"125168a5c86e5f696d2bf43ac4f999f364600bdb821bedcc6139c8ad9c0be647"} Oct 02 12:37:43 crc kubenswrapper[4929]: I1002 
12:37:43.799583 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" podStartSLOduration=3.79955949 podStartE2EDuration="3.79955949s" podCreationTimestamp="2025-10-02 12:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:43.795688248 +0000 UTC m=+5264.346054612" watchObservedRunningTime="2025-10-02 12:37:43.79955949 +0000 UTC m=+5264.349925864" Oct 02 12:37:44 crc kubenswrapper[4929]: I1002 12:37:44.737072 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:37:44 crc kubenswrapper[4929]: I1002 12:37:44.737414 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:37:44 crc kubenswrapper[4929]: I1002 12:37:44.737456 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:37:44 crc kubenswrapper[4929]: I1002 12:37:44.738183 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:37:44 crc kubenswrapper[4929]: I1002 12:37:44.738241 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" gracePeriod=600 Oct 02 12:37:44 crc kubenswrapper[4929]: I1002 12:37:44.785633 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:45 crc kubenswrapper[4929]: E1002 12:37:45.037508 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:37:45 crc kubenswrapper[4929]: I1002 12:37:45.797061 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" exitCode=0 Oct 02 12:37:45 crc kubenswrapper[4929]: I1002 12:37:45.797099 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf"} Oct 02 12:37:45 crc kubenswrapper[4929]: I1002 12:37:45.797434 4929 scope.go:117] "RemoveContainer" containerID="044cd82883d551890fa7db1c4b98cedddb42bd10a62b8c9f6662e1a7e915441a" Oct 02 12:37:45 crc kubenswrapper[4929]: I1002 12:37:45.798288 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:37:45 crc kubenswrapper[4929]: E1002 12:37:45.798663 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:37:46 crc kubenswrapper[4929]: I1002 12:37:46.805997 4929 generic.go:334] "Generic (PLEG): container finished" podID="aa5d218a-e650-4e97-bee7-e3f2d211606d" containerID="b738598602d618870e97f8380351e0ec26acef0127740d0fda55c76890576d9b" exitCode=0 Oct 02 12:37:46 crc kubenswrapper[4929]: I1002 12:37:46.806065 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jb5mh" event={"ID":"aa5d218a-e650-4e97-bee7-e3f2d211606d","Type":"ContainerDied","Data":"b738598602d618870e97f8380351e0ec26acef0127740d0fda55c76890576d9b"} Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.176462 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.271855 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-combined-ca-bundle\") pod \"aa5d218a-e650-4e97-bee7-e3f2d211606d\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.271917 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-credential-keys\") pod \"aa5d218a-e650-4e97-bee7-e3f2d211606d\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.271939 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h86cv\" (UniqueName: \"kubernetes.io/projected/aa5d218a-e650-4e97-bee7-e3f2d211606d-kube-api-access-h86cv\") pod \"aa5d218a-e650-4e97-bee7-e3f2d211606d\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.272103 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-config-data\") pod \"aa5d218a-e650-4e97-bee7-e3f2d211606d\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.272148 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-fernet-keys\") pod \"aa5d218a-e650-4e97-bee7-e3f2d211606d\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " Oct 02 12:37:48 crc 
kubenswrapper[4929]: I1002 12:37:48.272191 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-scripts\") pod \"aa5d218a-e650-4e97-bee7-e3f2d211606d\" (UID: \"aa5d218a-e650-4e97-bee7-e3f2d211606d\") " Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.277632 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5d218a-e650-4e97-bee7-e3f2d211606d-kube-api-access-h86cv" (OuterVolumeSpecName: "kube-api-access-h86cv") pod "aa5d218a-e650-4e97-bee7-e3f2d211606d" (UID: "aa5d218a-e650-4e97-bee7-e3f2d211606d"). InnerVolumeSpecName "kube-api-access-h86cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.278387 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-scripts" (OuterVolumeSpecName: "scripts") pod "aa5d218a-e650-4e97-bee7-e3f2d211606d" (UID: "aa5d218a-e650-4e97-bee7-e3f2d211606d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.278412 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "aa5d218a-e650-4e97-bee7-e3f2d211606d" (UID: "aa5d218a-e650-4e97-bee7-e3f2d211606d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.279464 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "aa5d218a-e650-4e97-bee7-e3f2d211606d" (UID: "aa5d218a-e650-4e97-bee7-e3f2d211606d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.295750 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-config-data" (OuterVolumeSpecName: "config-data") pod "aa5d218a-e650-4e97-bee7-e3f2d211606d" (UID: "aa5d218a-e650-4e97-bee7-e3f2d211606d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.297791 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa5d218a-e650-4e97-bee7-e3f2d211606d" (UID: "aa5d218a-e650-4e97-bee7-e3f2d211606d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.374460 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.374510 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.374527 4929 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.374538 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h86cv\" (UniqueName: \"kubernetes.io/projected/aa5d218a-e650-4e97-bee7-e3f2d211606d-kube-api-access-h86cv\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.374551 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.374563 4929 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/aa5d218a-e650-4e97-bee7-e3f2d211606d-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.826827 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jb5mh" event={"ID":"aa5d218a-e650-4e97-bee7-e3f2d211606d","Type":"ContainerDied","Data":"7ed8ebe120519ff3b452e2127dee56c0ff17c1a67271e392f665303a47648b58"} Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.826867 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ed8ebe120519ff3b452e2127dee56c0ff17c1a67271e392f665303a47648b58" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.826877 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jb5mh" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.888301 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jb5mh"] Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.899052 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jb5mh"] Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.998773 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-f2v2l"] Oct 02 12:37:48 crc kubenswrapper[4929]: E1002 12:37:48.999880 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d218a-e650-4e97-bee7-e3f2d211606d" containerName="keystone-bootstrap" Oct 02 12:37:48 crc kubenswrapper[4929]: I1002 12:37:48.999909 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d218a-e650-4e97-bee7-e3f2d211606d" containerName="keystone-bootstrap" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.001052 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5d218a-e650-4e97-bee7-e3f2d211606d" containerName="keystone-bootstrap" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.005305 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.008309 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.014738 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.015135 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mwwg9" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.015298 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.043112 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f2v2l"] Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.188777 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-credential-keys\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.188857 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8jm\" (UniqueName: \"kubernetes.io/projected/51638f83-9423-4982-a164-4483048e76a8-kube-api-access-4p8jm\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.188929 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-fernet-keys\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.189099 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-scripts\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.189249 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-combined-ca-bundle\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.189345 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-config-data\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.291242 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-fernet-keys\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.291312 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-scripts\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.291352 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-combined-ca-bundle\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.291382 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-config-data\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.291417 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-credential-keys\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.291451 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p8jm\" (UniqueName: \"kubernetes.io/projected/51638f83-9423-4982-a164-4483048e76a8-kube-api-access-4p8jm\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.296654 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-scripts\") pod \"keystone-bootstrap-f2v2l\" (UID: 
\"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.297024 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-config-data\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.297132 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-combined-ca-bundle\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.297206 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-fernet-keys\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.297392 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-credential-keys\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.308068 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p8jm\" (UniqueName: \"kubernetes.io/projected/51638f83-9423-4982-a164-4483048e76a8-kube-api-access-4p8jm\") pod \"keystone-bootstrap-f2v2l\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.338165 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.738236 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f2v2l"] Oct 02 12:37:49 crc kubenswrapper[4929]: I1002 12:37:49.835092 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f2v2l" event={"ID":"51638f83-9423-4982-a164-4483048e76a8","Type":"ContainerStarted","Data":"fa5fed65c00bff1b51598fedf9a01ac2e332793d07b5e110744170ab7e87b886"} Oct 02 12:37:50 crc kubenswrapper[4929]: I1002 12:37:50.164637 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5d218a-e650-4e97-bee7-e3f2d211606d" path="/var/lib/kubelet/pods/aa5d218a-e650-4e97-bee7-e3f2d211606d/volumes" Oct 02 12:37:50 crc kubenswrapper[4929]: I1002 12:37:50.844874 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f2v2l" event={"ID":"51638f83-9423-4982-a164-4483048e76a8","Type":"ContainerStarted","Data":"68d6dc446318d6f8a2390fe134f37903397a72dad02ebe947e0a8176f2141604"} Oct 02 12:37:50 crc kubenswrapper[4929]: I1002 12:37:50.864594 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-f2v2l" podStartSLOduration=2.864574899 podStartE2EDuration="2.864574899s" podCreationTimestamp="2025-10-02 12:37:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:50.861751538 +0000 UTC m=+5271.412117902" watchObservedRunningTime="2025-10-02 12:37:50.864574899 +0000 UTC m=+5271.414941263" Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.354148 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.448233 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8c66857-xtjcc"] Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.448730 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerName="dnsmasq-dns" containerID="cri-o://70dfc92cff4800037a88feb3dd0f80a1ac32805dd1374353bb74f71d693bc5c7" gracePeriod=10 Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.651752 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.12:5353: connect: connection refused" Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.852921 4929 generic.go:334] "Generic (PLEG): container finished" podID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerID="70dfc92cff4800037a88feb3dd0f80a1ac32805dd1374353bb74f71d693bc5c7" exitCode=0 Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.853058 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" event={"ID":"4c854aa2-1911-4558-a3cc-0b7f2954a16e","Type":"ContainerDied","Data":"70dfc92cff4800037a88feb3dd0f80a1ac32805dd1374353bb74f71d693bc5c7"} Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.853151 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" 
event={"ID":"4c854aa2-1911-4558-a3cc-0b7f2954a16e","Type":"ContainerDied","Data":"52a9f227890fbda96392d07c0c146826a5be2f13d8d985a18f64bc364dfb20fb"} Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.853167 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a9f227890fbda96392d07c0c146826a5be2f13d8d985a18f64bc364dfb20fb" Oct 02 12:37:51 crc kubenswrapper[4929]: I1002 12:37:51.912886 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.035599 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-config\") pod \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.035714 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-sb\") pod \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.035818 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-dns-svc\") pod \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.035859 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xt9s\" (UniqueName: \"kubernetes.io/projected/4c854aa2-1911-4558-a3cc-0b7f2954a16e-kube-api-access-5xt9s\") pod \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.035935 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-nb\") pod \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\" (UID: \"4c854aa2-1911-4558-a3cc-0b7f2954a16e\") " Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.042106 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c854aa2-1911-4558-a3cc-0b7f2954a16e-kube-api-access-5xt9s" (OuterVolumeSpecName: "kube-api-access-5xt9s") pod "4c854aa2-1911-4558-a3cc-0b7f2954a16e" (UID: "4c854aa2-1911-4558-a3cc-0b7f2954a16e"). InnerVolumeSpecName "kube-api-access-5xt9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.076728 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c854aa2-1911-4558-a3cc-0b7f2954a16e" (UID: "4c854aa2-1911-4558-a3cc-0b7f2954a16e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.078022 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-config" (OuterVolumeSpecName: "config") pod "4c854aa2-1911-4558-a3cc-0b7f2954a16e" (UID: "4c854aa2-1911-4558-a3cc-0b7f2954a16e"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.079372 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c854aa2-1911-4558-a3cc-0b7f2954a16e" (UID: "4c854aa2-1911-4558-a3cc-0b7f2954a16e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.080029 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c854aa2-1911-4558-a3cc-0b7f2954a16e" (UID: "4c854aa2-1911-4558-a3cc-0b7f2954a16e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.137999 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.138038 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.138081 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.138091 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c854aa2-1911-4558-a3cc-0b7f2954a16e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.138100 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xt9s\" (UniqueName: \"kubernetes.io/projected/4c854aa2-1911-4558-a3cc-0b7f2954a16e-kube-api-access-5xt9s\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.863187 4929 generic.go:334] "Generic (PLEG): container finished" podID="51638f83-9423-4982-a164-4483048e76a8" containerID="68d6dc446318d6f8a2390fe134f37903397a72dad02ebe947e0a8176f2141604" exitCode=0 Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.863344 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8c66857-xtjcc" Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.863825 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f2v2l" event={"ID":"51638f83-9423-4982-a164-4483048e76a8","Type":"ContainerDied","Data":"68d6dc446318d6f8a2390fe134f37903397a72dad02ebe947e0a8176f2141604"} Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.902485 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8c66857-xtjcc"] Oct 02 12:37:52 crc kubenswrapper[4929]: I1002 12:37:52.909428 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d8c66857-xtjcc"] Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.154265 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.170439 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" path="/var/lib/kubelet/pods/4c854aa2-1911-4558-a3cc-0b7f2954a16e/volumes" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.271452 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-config-data\") pod \"51638f83-9423-4982-a164-4483048e76a8\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.271525 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-fernet-keys\") pod \"51638f83-9423-4982-a164-4483048e76a8\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.271576 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p8jm\" (UniqueName: \"kubernetes.io/projected/51638f83-9423-4982-a164-4483048e76a8-kube-api-access-4p8jm\") pod \"51638f83-9423-4982-a164-4483048e76a8\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.271625 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-scripts\") pod \"51638f83-9423-4982-a164-4483048e76a8\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.271731 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-combined-ca-bundle\") pod \"51638f83-9423-4982-a164-4483048e76a8\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.271770 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-credential-keys\") pod \"51638f83-9423-4982-a164-4483048e76a8\" (UID: \"51638f83-9423-4982-a164-4483048e76a8\") " Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.277756 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "51638f83-9423-4982-a164-4483048e76a8" (UID: "51638f83-9423-4982-a164-4483048e76a8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.277800 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "51638f83-9423-4982-a164-4483048e76a8" (UID: "51638f83-9423-4982-a164-4483048e76a8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.277818 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-scripts" (OuterVolumeSpecName: "scripts") pod "51638f83-9423-4982-a164-4483048e76a8" (UID: "51638f83-9423-4982-a164-4483048e76a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.278082 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51638f83-9423-4982-a164-4483048e76a8-kube-api-access-4p8jm" (OuterVolumeSpecName: "kube-api-access-4p8jm") pod "51638f83-9423-4982-a164-4483048e76a8" (UID: "51638f83-9423-4982-a164-4483048e76a8"). InnerVolumeSpecName "kube-api-access-4p8jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.294472 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51638f83-9423-4982-a164-4483048e76a8" (UID: "51638f83-9423-4982-a164-4483048e76a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.294804 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-config-data" (OuterVolumeSpecName: "config-data") pod "51638f83-9423-4982-a164-4483048e76a8" (UID: "51638f83-9423-4982-a164-4483048e76a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.374156 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.374186 4929 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.374199 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.374207 4929 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.374216 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p8jm\" (UniqueName: \"kubernetes.io/projected/51638f83-9423-4982-a164-4483048e76a8-kube-api-access-4p8jm\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.374227 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51638f83-9423-4982-a164-4483048e76a8-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.883423 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-f2v2l" event={"ID":"51638f83-9423-4982-a164-4483048e76a8","Type":"ContainerDied","Data":"fa5fed65c00bff1b51598fedf9a01ac2e332793d07b5e110744170ab7e87b886"} Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.883916 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5fed65c00bff1b51598fedf9a01ac2e332793d07b5e110744170ab7e87b886" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.883501 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f2v2l" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.968769 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-548d99c794-gk42m"] Oct 02 12:37:54 crc kubenswrapper[4929]: E1002 12:37:54.969182 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerName="init" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.969206 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerName="init" Oct 02 12:37:54 crc kubenswrapper[4929]: E1002 12:37:54.969223 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51638f83-9423-4982-a164-4483048e76a8" containerName="keystone-bootstrap" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.969232 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="51638f83-9423-4982-a164-4483048e76a8" containerName="keystone-bootstrap" Oct 02 12:37:54 crc kubenswrapper[4929]: E1002 12:37:54.969269 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerName="dnsmasq-dns" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.969276 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerName="dnsmasq-dns" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.969426 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="51638f83-9423-4982-a164-4483048e76a8" containerName="keystone-bootstrap" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.969454 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c854aa2-1911-4558-a3cc-0b7f2954a16e" containerName="dnsmasq-dns" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.970042 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.987931 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.989705 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.990253 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mwwg9" Oct 02 12:37:54 crc kubenswrapper[4929]: I1002 12:37:54.990302 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.008240 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-548d99c794-gk42m"] Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.083502 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-combined-ca-bundle\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.083598 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-scripts\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.083650 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-credential-keys\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.083687 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-fernet-keys\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.083732 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-config-data\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.083881 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whbjl\" (UniqueName: \"kubernetes.io/projected/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-kube-api-access-whbjl\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.185617 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-combined-ca-bundle\") pod 
\"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.185728 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-scripts\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.185783 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-credential-keys\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.185824 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-fernet-keys\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.185869 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-config-data\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.185903 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whbjl\" (UniqueName: \"kubernetes.io/projected/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-kube-api-access-whbjl\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.191281 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-scripts\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.191439 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-config-data\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.191461 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-fernet-keys\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.191481 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-combined-ca-bundle\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.192352 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-credential-keys\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.205844 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whbjl\" (UniqueName: \"kubernetes.io/projected/cdb34ab4-e82e-492e-81cb-0b9dabdf54ae-kube-api-access-whbjl\") pod \"keystone-548d99c794-gk42m\" (UID: \"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae\") " pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.299086 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.762195 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-548d99c794-gk42m"] Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.894501 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548d99c794-gk42m" event={"ID":"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae","Type":"ContainerStarted","Data":"554206cd9f47a8022b59fcfef5f13229532fbd5b5aea5663606a07f4fa441990"} Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.894896 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548d99c794-gk42m" event={"ID":"cdb34ab4-e82e-492e-81cb-0b9dabdf54ae","Type":"ContainerStarted","Data":"7db50ec47d8f0a1b449e7f36cd3b1790218676f68304afac6e032a537b2d0f38"} Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.894917 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:37:55 crc kubenswrapper[4929]: I1002 12:37:55.915146 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-548d99c794-gk42m" podStartSLOduration=1.915123598 podStartE2EDuration="1.915123598s" podCreationTimestamp="2025-10-02 12:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:55.910228387 +0000 UTC m=+5276.460594751" watchObservedRunningTime="2025-10-02 12:37:55.915123598 +0000 UTC m=+5276.465489962" Oct 02 12:38:00 crc kubenswrapper[4929]: I1002 12:38:00.161098 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:38:00 crc kubenswrapper[4929]: E1002 12:38:00.161581 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:38:12 crc kubenswrapper[4929]: I1002 12:38:12.157039 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:38:12 crc kubenswrapper[4929]: E1002 12:38:12.157883 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:38:15 crc kubenswrapper[4929]: I1002 12:38:15.762577 4929 scope.go:117] "RemoveContainer" containerID="cd9a06682b761c4340da1c94f96f7152c3c128ad50bffffe92f4cb022c5a74ec" Oct 02 12:38:15 crc kubenswrapper[4929]: I1002 12:38:15.783838 4929 scope.go:117] "RemoveContainer" containerID="e8dfc5fbb9dfad623299b357a39c87237d6bc1d449764ea0f022cdda7c40fa6d" Oct 02 12:38:15 crc kubenswrapper[4929]: I1002 12:38:15.816223 4929 scope.go:117] "RemoveContainer" containerID="9bcd70608068166610e1f5a8d03f985ffc2d37b988f2e83498939f29eb96f71b" Oct 02 12:38:26 crc kubenswrapper[4929]: I1002 12:38:26.873539 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-548d99c794-gk42m" Oct 02 12:38:27 crc kubenswrapper[4929]: I1002 12:38:27.156552 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:38:27 crc kubenswrapper[4929]: E1002 12:38:27.156755 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.215372 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.216782 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.218806 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-xfn95" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.220176 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.220483 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.231594 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.317710 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.317776 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config-secret\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.318057 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdpd\" (UniqueName: \"kubernetes.io/projected/071cc895-9a58-4f01-90cd-f0095a6b0f22-kube-api-access-tgdpd\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.419101 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.419178 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config-secret\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.419256 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgdpd\" (UniqueName: \"kubernetes.io/projected/071cc895-9a58-4f01-90cd-f0095a6b0f22-kube-api-access-tgdpd\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.420073 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.431674 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config-secret\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.446516 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgdpd\" (UniqueName: \"kubernetes.io/projected/071cc895-9a58-4f01-90cd-f0095a6b0f22-kube-api-access-tgdpd\") pod \"openstackclient\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") " pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.544063 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:30 crc kubenswrapper[4929]: I1002 12:38:30.973949 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:31 crc kubenswrapper[4929]: I1002 12:38:31.186056 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"071cc895-9a58-4f01-90cd-f0095a6b0f22","Type":"ContainerStarted","Data":"1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75"} Oct 02 12:38:31 crc kubenswrapper[4929]: I1002 12:38:31.186540 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"071cc895-9a58-4f01-90cd-f0095a6b0f22","Type":"ContainerStarted","Data":"108ff9e6cc381746f3f4b2ec4be3c8151793879061dbef35a9eea919b001c28a"} Oct 02 12:38:31 crc kubenswrapper[4929]: I1002 12:38:31.212937 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.212915681 podStartE2EDuration="1.212915681s" podCreationTimestamp="2025-10-02 12:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:38:31.210856842 +0000 UTC m=+5311.761223196" watchObservedRunningTime="2025-10-02 12:38:31.212915681 +0000 UTC m=+5311.763282045" Oct 02 12:38:42 crc kubenswrapper[4929]: I1002 12:38:42.157384 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:38:42 crc kubenswrapper[4929]: E1002 12:38:42.158237 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:38:55 crc kubenswrapper[4929]: I1002 12:38:55.157123 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:38:55 crc kubenswrapper[4929]: E1002 12:38:55.158075 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:39:10 crc kubenswrapper[4929]: I1002 12:39:10.161305 4929 scope.go:117] "RemoveContainer" 
containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:39:10 crc kubenswrapper[4929]: E1002 12:39:10.162168 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:39:21 crc kubenswrapper[4929]: I1002 12:39:21.157119 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:39:21 crc kubenswrapper[4929]: E1002 12:39:21.157718 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:39:35 crc kubenswrapper[4929]: I1002 12:39:35.157278 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:39:35 crc kubenswrapper[4929]: E1002 12:39:35.158305 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:39:49 crc kubenswrapper[4929]: I1002 12:39:49.156780 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:39:49 crc kubenswrapper[4929]: E1002 12:39:49.157598 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:40:03 crc kubenswrapper[4929]: I1002 12:40:03.157128 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:40:03 crc kubenswrapper[4929]: E1002 12:40:03.157879 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.206990 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-td9gc"] Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.208846 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.223731 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-td9gc"] Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.327950 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-utilities\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.328250 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rh9x\" (UniqueName: \"kubernetes.io/projected/341d606e-e888-4c95-bc0b-71d4a90848de-kube-api-access-4rh9x\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.328296 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-catalog-content\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.430089 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rh9x\" (UniqueName: \"kubernetes.io/projected/341d606e-e888-4c95-bc0b-71d4a90848de-kube-api-access-4rh9x\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.430140 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-catalog-content\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.430181 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-utilities\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.430674 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-utilities\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.430722 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-catalog-content\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.450776 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4rh9x\" (UniqueName: \"kubernetes.io/projected/341d606e-e888-4c95-bc0b-71d4a90848de-kube-api-access-4rh9x\") pod \"redhat-marketplace-td9gc\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.528288 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:04 crc kubenswrapper[4929]: I1002 12:40:04.976946 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-td9gc"] Oct 02 12:40:05 crc kubenswrapper[4929]: I1002 12:40:05.986120 4929 generic.go:334] "Generic (PLEG): container finished" podID="341d606e-e888-4c95-bc0b-71d4a90848de" containerID="c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760" exitCode=0 Oct 02 12:40:05 crc kubenswrapper[4929]: I1002 12:40:05.986175 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-td9gc" event={"ID":"341d606e-e888-4c95-bc0b-71d4a90848de","Type":"ContainerDied","Data":"c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760"} Oct 02 12:40:05 crc kubenswrapper[4929]: I1002 12:40:05.986229 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-td9gc" event={"ID":"341d606e-e888-4c95-bc0b-71d4a90848de","Type":"ContainerStarted","Data":"b059940092d1d64ca133d7b7b2d4e491389f93c94190927cdd954e413b973ba2"} Oct 02 12:40:08 crc kubenswrapper[4929]: I1002 12:40:08.003129 4929 generic.go:334] "Generic (PLEG): container finished" podID="341d606e-e888-4c95-bc0b-71d4a90848de" containerID="0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf" exitCode=0 Oct 02 12:40:08 crc kubenswrapper[4929]: I1002 12:40:08.003224 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-td9gc" event={"ID":"341d606e-e888-4c95-bc0b-71d4a90848de","Type":"ContainerDied","Data":"0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf"} Oct 02 12:40:09 crc kubenswrapper[4929]: I1002 12:40:09.015735 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-td9gc" event={"ID":"341d606e-e888-4c95-bc0b-71d4a90848de","Type":"ContainerStarted","Data":"8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b"} Oct 02 12:40:09 crc kubenswrapper[4929]: I1002 12:40:09.038581 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-td9gc" podStartSLOduration=2.55134779 podStartE2EDuration="5.038561073s" podCreationTimestamp="2025-10-02 12:40:04 +0000 UTC" firstStartedPulling="2025-10-02 12:40:05.987546194 +0000 UTC m=+5406.537912558" lastFinishedPulling="2025-10-02 12:40:08.474759477 +0000 UTC m=+5409.025125841" observedRunningTime="2025-10-02 12:40:09.035526045 +0000 UTC m=+5409.585892409" watchObservedRunningTime="2025-10-02 12:40:09.038561073 +0000 UTC m=+5409.588927437" Oct 02 12:40:11 crc kubenswrapper[4929]: I1002 12:40:11.809199 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-gnrvj"] Oct 02 12:40:11 crc kubenswrapper[4929]: I1002 12:40:11.810740 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gnrvj" Oct 02 12:40:11 crc kubenswrapper[4929]: I1002 12:40:11.821673 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gnrvj"] Oct 02 12:40:11 crc kubenswrapper[4929]: I1002 12:40:11.845678 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx7dz\" (UniqueName: \"kubernetes.io/projected/ed25c231-e5a1-4e95-91f6-ec543494202b-kube-api-access-xx7dz\") pod \"barbican-db-create-gnrvj\" (UID: \"ed25c231-e5a1-4e95-91f6-ec543494202b\") " pod="openstack/barbican-db-create-gnrvj" Oct 02 12:40:11 crc kubenswrapper[4929]: I1002 12:40:11.946874 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx7dz\" (UniqueName: \"kubernetes.io/projected/ed25c231-e5a1-4e95-91f6-ec543494202b-kube-api-access-xx7dz\") pod \"barbican-db-create-gnrvj\" (UID: \"ed25c231-e5a1-4e95-91f6-ec543494202b\") " pod="openstack/barbican-db-create-gnrvj" Oct 02 12:40:11 crc kubenswrapper[4929]: I1002 12:40:11.970215 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx7dz\" (UniqueName: \"kubernetes.io/projected/ed25c231-e5a1-4e95-91f6-ec543494202b-kube-api-access-xx7dz\") pod \"barbican-db-create-gnrvj\" (UID: \"ed25c231-e5a1-4e95-91f6-ec543494202b\") " pod="openstack/barbican-db-create-gnrvj" Oct 02 12:40:12 crc kubenswrapper[4929]: I1002 12:40:12.134395 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gnrvj" Oct 02 12:40:12 crc kubenswrapper[4929]: I1002 12:40:12.615522 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gnrvj"] Oct 02 12:40:12 crc kubenswrapper[4929]: W1002 12:40:12.620448 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded25c231_e5a1_4e95_91f6_ec543494202b.slice/crio-6c7fd2edda26a11f0f99243144e3a6a3fed5ae0bc1a02c60cf5ba17186d7b873 WatchSource:0}: Error finding container 6c7fd2edda26a11f0f99243144e3a6a3fed5ae0bc1a02c60cf5ba17186d7b873: Status 404 returned error can't find the container with id 6c7fd2edda26a11f0f99243144e3a6a3fed5ae0bc1a02c60cf5ba17186d7b873 Oct 02 12:40:13 crc kubenswrapper[4929]: I1002 12:40:13.052673 4929 generic.go:334] "Generic (PLEG): container finished" podID="ed25c231-e5a1-4e95-91f6-ec543494202b" containerID="9d29a1cf1c362f8b4c17aa145acfb0e868d6ae569f07f95dd6a0dbbe95172abb" exitCode=0 Oct 02 12:40:13 crc kubenswrapper[4929]: I1002 12:40:13.052728 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gnrvj" event={"ID":"ed25c231-e5a1-4e95-91f6-ec543494202b","Type":"ContainerDied","Data":"9d29a1cf1c362f8b4c17aa145acfb0e868d6ae569f07f95dd6a0dbbe95172abb"} Oct 02 12:40:13 crc kubenswrapper[4929]: I1002 12:40:13.052981 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gnrvj" event={"ID":"ed25c231-e5a1-4e95-91f6-ec543494202b","Type":"ContainerStarted","Data":"6c7fd2edda26a11f0f99243144e3a6a3fed5ae0bc1a02c60cf5ba17186d7b873"} Oct 02 12:40:14 crc kubenswrapper[4929]: I1002 12:40:14.397125 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gnrvj" Oct 02 12:40:14 crc kubenswrapper[4929]: I1002 12:40:14.499789 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx7dz\" (UniqueName: \"kubernetes.io/projected/ed25c231-e5a1-4e95-91f6-ec543494202b-kube-api-access-xx7dz\") pod \"ed25c231-e5a1-4e95-91f6-ec543494202b\" (UID: \"ed25c231-e5a1-4e95-91f6-ec543494202b\") " Oct 02 12:40:14 crc kubenswrapper[4929]: I1002 12:40:14.506312 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed25c231-e5a1-4e95-91f6-ec543494202b-kube-api-access-xx7dz" (OuterVolumeSpecName: "kube-api-access-xx7dz") pod "ed25c231-e5a1-4e95-91f6-ec543494202b" (UID: "ed25c231-e5a1-4e95-91f6-ec543494202b"). InnerVolumeSpecName "kube-api-access-xx7dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:14 crc kubenswrapper[4929]: I1002 12:40:14.528546 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:14 crc kubenswrapper[4929]: I1002 12:40:14.528603 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:14 crc kubenswrapper[4929]: I1002 12:40:14.576488 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:14 crc kubenswrapper[4929]: I1002 12:40:14.601212 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx7dz\" (UniqueName: \"kubernetes.io/projected/ed25c231-e5a1-4e95-91f6-ec543494202b-kube-api-access-xx7dz\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:15 crc kubenswrapper[4929]: I1002 12:40:15.075190 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gnrvj" event={"ID":"ed25c231-e5a1-4e95-91f6-ec543494202b","Type":"ContainerDied","Data":"6c7fd2edda26a11f0f99243144e3a6a3fed5ae0bc1a02c60cf5ba17186d7b873"} Oct 02 12:40:15 crc kubenswrapper[4929]: I1002 12:40:15.075217 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gnrvj" Oct 02 12:40:15 crc kubenswrapper[4929]: I1002 12:40:15.075235 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7fd2edda26a11f0f99243144e3a6a3fed5ae0bc1a02c60cf5ba17186d7b873" Oct 02 12:40:15 crc kubenswrapper[4929]: I1002 12:40:15.133047 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:15 crc kubenswrapper[4929]: I1002 12:40:15.172711 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-td9gc"] Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.104874 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-td9gc" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="registry-server" containerID="cri-o://8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b" gracePeriod=2 Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.525702 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.653221 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-utilities\") pod \"341d606e-e888-4c95-bc0b-71d4a90848de\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.653310 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rh9x\" (UniqueName: \"kubernetes.io/projected/341d606e-e888-4c95-bc0b-71d4a90848de-kube-api-access-4rh9x\") pod \"341d606e-e888-4c95-bc0b-71d4a90848de\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.653346 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-catalog-content\") pod \"341d606e-e888-4c95-bc0b-71d4a90848de\" (UID: \"341d606e-e888-4c95-bc0b-71d4a90848de\") " Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.654562 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-utilities" (OuterVolumeSpecName: "utilities") pod "341d606e-e888-4c95-bc0b-71d4a90848de" (UID: "341d606e-e888-4c95-bc0b-71d4a90848de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.666196 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/341d606e-e888-4c95-bc0b-71d4a90848de-kube-api-access-4rh9x" (OuterVolumeSpecName: "kube-api-access-4rh9x") pod "341d606e-e888-4c95-bc0b-71d4a90848de" (UID: "341d606e-e888-4c95-bc0b-71d4a90848de"). InnerVolumeSpecName "kube-api-access-4rh9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.667422 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "341d606e-e888-4c95-bc0b-71d4a90848de" (UID: "341d606e-e888-4c95-bc0b-71d4a90848de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.754937 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.755020 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rh9x\" (UniqueName: \"kubernetes.io/projected/341d606e-e888-4c95-bc0b-71d4a90848de-kube-api-access-4rh9x\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:17 crc kubenswrapper[4929]: I1002 12:40:17.755066 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/341d606e-e888-4c95-bc0b-71d4a90848de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.115487 4929 generic.go:334] "Generic (PLEG): container finished" podID="341d606e-e888-4c95-bc0b-71d4a90848de" containerID="8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b" exitCode=0 Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.115555 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-td9gc" event={"ID":"341d606e-e888-4c95-bc0b-71d4a90848de","Type":"ContainerDied","Data":"8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b"} Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.116061 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-td9gc" event={"ID":"341d606e-e888-4c95-bc0b-71d4a90848de","Type":"ContainerDied","Data":"b059940092d1d64ca133d7b7b2d4e491389f93c94190927cdd954e413b973ba2"} Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.116092 4929 scope.go:117] "RemoveContainer" containerID="8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.115631 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-td9gc" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.137219 4929 scope.go:117] "RemoveContainer" containerID="0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.157917 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:40:18 crc kubenswrapper[4929]: E1002 12:40:18.158630 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.162780 4929 scope.go:117] "RemoveContainer" containerID="c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.173454 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-td9gc"] Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.177393 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-td9gc"] Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.195203 4929 scope.go:117] "RemoveContainer" containerID="8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b" Oct 02 12:40:18 crc kubenswrapper[4929]: E1002 12:40:18.196075 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b\": container with ID starting with 8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b not found: ID does not exist" containerID="8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.196405 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b"} err="failed to get container status \"8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b\": rpc error: code = NotFound desc = could not find container \"8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b\": container with ID starting with 8877b37e827077412f84d1b6966ab0817c02a96c4bcadb062aaab5422dab1d2b not found: ID does not exist" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.196488 4929 scope.go:117] "RemoveContainer" containerID="0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf" Oct 02 12:40:18 crc kubenswrapper[4929]: E1002 12:40:18.197630 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf\": container with ID starting with 0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf not found: ID does not exist" containerID="0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.197713 4929 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf"} err="failed to get container status \"0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf\": rpc error: code = NotFound desc = could not find container \"0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf\": container with ID starting with 0a420f1feda3ff0d4b9fad70ac79c7add99d48341646a7ed4de3d83f8e5911cf not found: ID does not exist" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.197801 4929 scope.go:117] "RemoveContainer" containerID="c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760" Oct 02 12:40:18 crc kubenswrapper[4929]: E1002 12:40:18.198474 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760\": container with ID starting with c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760 not found: ID does not exist" containerID="c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760" Oct 02 12:40:18 crc kubenswrapper[4929]: I1002 12:40:18.198511 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760"} err="failed to get container status \"c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760\": rpc error: code = NotFound desc = could not find container \"c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760\": container with ID starting with c468c1e5b2da774836bf03f954a3a013b20f6125017a161cfa3cff9b702be760 not found: ID does not exist" Oct 02 12:40:20 crc kubenswrapper[4929]: I1002 12:40:20.166878 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" path="/var/lib/kubelet/pods/341d606e-e888-4c95-bc0b-71d4a90848de/volumes" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.936930 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-87e3-account-create-q47md"] Oct 02 12:40:21 crc kubenswrapper[4929]: E1002 12:40:21.937593 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="extract-utilities" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.937608 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="extract-utilities" Oct 02 12:40:21 crc kubenswrapper[4929]: E1002 12:40:21.937626 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="registry-server" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.937632 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="registry-server" Oct 02 12:40:21 crc kubenswrapper[4929]: E1002 12:40:21.937649 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="extract-content" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.937655 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="extract-content" Oct 02 12:40:21 crc kubenswrapper[4929]: E1002 12:40:21.937666 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed25c231-e5a1-4e95-91f6-ec543494202b" containerName="mariadb-database-create" Oct 02 12:40:21 crc 
kubenswrapper[4929]: I1002 12:40:21.937672 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed25c231-e5a1-4e95-91f6-ec543494202b" containerName="mariadb-database-create" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.937822 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed25c231-e5a1-4e95-91f6-ec543494202b" containerName="mariadb-database-create" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.937842 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="341d606e-e888-4c95-bc0b-71d4a90848de" containerName="registry-server" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.938394 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-87e3-account-create-q47md" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.940519 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 12:40:21 crc kubenswrapper[4929]: I1002 12:40:21.951459 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-87e3-account-create-q47md"] Oct 02 12:40:22 crc kubenswrapper[4929]: I1002 12:40:22.033515 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k97pn\" (UniqueName: \"kubernetes.io/projected/e12f0c57-b158-4d49-ae7f-b984511da980-kube-api-access-k97pn\") pod \"barbican-87e3-account-create-q47md\" (UID: \"e12f0c57-b158-4d49-ae7f-b984511da980\") " pod="openstack/barbican-87e3-account-create-q47md" Oct 02 12:40:22 crc kubenswrapper[4929]: I1002 12:40:22.135515 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k97pn\" (UniqueName: \"kubernetes.io/projected/e12f0c57-b158-4d49-ae7f-b984511da980-kube-api-access-k97pn\") pod \"barbican-87e3-account-create-q47md\" (UID: \"e12f0c57-b158-4d49-ae7f-b984511da980\") " pod="openstack/barbican-87e3-account-create-q47md" Oct 02 12:40:22 crc kubenswrapper[4929]: I1002 12:40:22.168743 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k97pn\" (UniqueName: \"kubernetes.io/projected/e12f0c57-b158-4d49-ae7f-b984511da980-kube-api-access-k97pn\") pod \"barbican-87e3-account-create-q47md\" (UID: \"e12f0c57-b158-4d49-ae7f-b984511da980\") " pod="openstack/barbican-87e3-account-create-q47md" Oct 02 12:40:22 crc kubenswrapper[4929]: I1002 12:40:22.263450 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-87e3-account-create-q47md" Oct 02 12:40:22 crc kubenswrapper[4929]: I1002 12:40:22.669635 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-87e3-account-create-q47md"] Oct 02 12:40:23 crc kubenswrapper[4929]: I1002 12:40:23.163565 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-87e3-account-create-q47md" event={"ID":"e12f0c57-b158-4d49-ae7f-b984511da980","Type":"ContainerStarted","Data":"3cece3341682b00d935b88b232a24bc769ee27afc0d2cef4c55f91b5984e377a"} Oct 02 12:40:23 crc kubenswrapper[4929]: I1002 12:40:23.163620 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-87e3-account-create-q47md" event={"ID":"e12f0c57-b158-4d49-ae7f-b984511da980","Type":"ContainerStarted","Data":"8d02d58626a06736cb52e37db4982af133df073afca1250ce75aef12711611ba"} Oct 02 12:40:24 crc kubenswrapper[4929]: I1002 12:40:24.178467 4929 generic.go:334] "Generic (PLEG): container finished" podID="e12f0c57-b158-4d49-ae7f-b984511da980" containerID="3cece3341682b00d935b88b232a24bc769ee27afc0d2cef4c55f91b5984e377a" exitCode=0 Oct 02 12:40:24 crc kubenswrapper[4929]: I1002 12:40:24.178529 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-87e3-account-create-q47md" event={"ID":"e12f0c57-b158-4d49-ae7f-b984511da980","Type":"ContainerDied","Data":"3cece3341682b00d935b88b232a24bc769ee27afc0d2cef4c55f91b5984e377a"} Oct 02 12:40:25 crc kubenswrapper[4929]: I1002 12:40:25.467783 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-87e3-account-create-q47md" Oct 02 12:40:25 crc kubenswrapper[4929]: I1002 12:40:25.498501 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k97pn\" (UniqueName: \"kubernetes.io/projected/e12f0c57-b158-4d49-ae7f-b984511da980-kube-api-access-k97pn\") pod \"e12f0c57-b158-4d49-ae7f-b984511da980\" (UID: \"e12f0c57-b158-4d49-ae7f-b984511da980\") " Oct 02 12:40:25 crc kubenswrapper[4929]: I1002 12:40:25.506835 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e12f0c57-b158-4d49-ae7f-b984511da980-kube-api-access-k97pn" (OuterVolumeSpecName: "kube-api-access-k97pn") pod "e12f0c57-b158-4d49-ae7f-b984511da980" (UID: "e12f0c57-b158-4d49-ae7f-b984511da980"). InnerVolumeSpecName "kube-api-access-k97pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:25 crc kubenswrapper[4929]: I1002 12:40:25.599901 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k97pn\" (UniqueName: \"kubernetes.io/projected/e12f0c57-b158-4d49-ae7f-b984511da980-kube-api-access-k97pn\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:26 crc kubenswrapper[4929]: I1002 12:40:26.195819 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-87e3-account-create-q47md" event={"ID":"e12f0c57-b158-4d49-ae7f-b984511da980","Type":"ContainerDied","Data":"8d02d58626a06736cb52e37db4982af133df073afca1250ce75aef12711611ba"} Oct 02 12:40:26 crc kubenswrapper[4929]: I1002 12:40:26.195868 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-87e3-account-create-q47md" Oct 02 12:40:26 crc kubenswrapper[4929]: I1002 12:40:26.195871 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d02d58626a06736cb52e37db4982af133df073afca1250ce75aef12711611ba" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.111825 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-knkjw"] Oct 02 12:40:27 crc kubenswrapper[4929]: E1002 12:40:27.112242 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12f0c57-b158-4d49-ae7f-b984511da980" containerName="mariadb-account-create" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.112265 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12f0c57-b158-4d49-ae7f-b984511da980" containerName="mariadb-account-create" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.112412 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12f0c57-b158-4d49-ae7f-b984511da980" containerName="mariadb-account-create" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.112938 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.115347 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5cmp6" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.115538 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.147283 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-knkjw"] Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.233382 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-db-sync-config-data\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.233598 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts6st\" (UniqueName: \"kubernetes.io/projected/b12e4480-5f68-430b-af5a-e3c955a3b006-kube-api-access-ts6st\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.233626 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-combined-ca-bundle\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.335223 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts6st\" (UniqueName: \"kubernetes.io/projected/b12e4480-5f68-430b-af5a-e3c955a3b006-kube-api-access-ts6st\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.335301 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-combined-ca-bundle\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.336143 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-db-sync-config-data\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.339888 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-db-sync-config-data\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.340204 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-combined-ca-bundle\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.350697 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts6st\" (UniqueName: \"kubernetes.io/projected/b12e4480-5f68-430b-af5a-e3c955a3b006-kube-api-access-ts6st\") pod \"barbican-db-sync-knkjw\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.436537 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:27 crc kubenswrapper[4929]: I1002 12:40:27.947144 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-knkjw"] Oct 02 12:40:27 crc kubenswrapper[4929]: W1002 12:40:27.957103 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12e4480_5f68_430b_af5a_e3c955a3b006.slice/crio-87b7d215aa2522af7c9dcf0259d0cd6b576e1dbc045864d929540ac4be213111 WatchSource:0}: Error finding container 87b7d215aa2522af7c9dcf0259d0cd6b576e1dbc045864d929540ac4be213111: Status 404 returned error can't find the container with id 87b7d215aa2522af7c9dcf0259d0cd6b576e1dbc045864d929540ac4be213111 Oct 02 12:40:28 crc kubenswrapper[4929]: I1002 12:40:28.210266 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-knkjw" event={"ID":"b12e4480-5f68-430b-af5a-e3c955a3b006","Type":"ContainerStarted","Data":"87b7d215aa2522af7c9dcf0259d0cd6b576e1dbc045864d929540ac4be213111"} Oct 02 12:40:29 crc kubenswrapper[4929]: I1002 12:40:29.223103 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-knkjw" event={"ID":"b12e4480-5f68-430b-af5a-e3c955a3b006","Type":"ContainerStarted","Data":"597660d037f8f36df21b9bbbad970f76560cda6b5dc965a6f5dce3503af2d3b5"} Oct 02 12:40:29 crc kubenswrapper[4929]: I1002 12:40:29.256884 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-knkjw" podStartSLOduration=2.256853868 podStartE2EDuration="2.256853868s" podCreationTimestamp="2025-10-02 12:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:40:29.245005517 +0000 UTC m=+5429.795371891" watchObservedRunningTime="2025-10-02 12:40:29.256853868 +0000 UTC m=+5429.807220272" Oct 02 12:40:33 crc kubenswrapper[4929]: I1002 12:40:33.158312 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:40:33 crc kubenswrapper[4929]: E1002 12:40:33.159097 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:40:36 crc kubenswrapper[4929]: I1002 12:40:36.301950 4929 generic.go:334] "Generic (PLEG): container finished" podID="b12e4480-5f68-430b-af5a-e3c955a3b006" containerID="597660d037f8f36df21b9bbbad970f76560cda6b5dc965a6f5dce3503af2d3b5" exitCode=0 Oct 02 12:40:36 crc kubenswrapper[4929]: I1002 12:40:36.302079 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-knkjw" event={"ID":"b12e4480-5f68-430b-af5a-e3c955a3b006","Type":"ContainerDied","Data":"597660d037f8f36df21b9bbbad970f76560cda6b5dc965a6f5dce3503af2d3b5"} Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.649917 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.717092 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-combined-ca-bundle\") pod \"b12e4480-5f68-430b-af5a-e3c955a3b006\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.717161 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-db-sync-config-data\") pod \"b12e4480-5f68-430b-af5a-e3c955a3b006\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.717269 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts6st\" (UniqueName: \"kubernetes.io/projected/b12e4480-5f68-430b-af5a-e3c955a3b006-kube-api-access-ts6st\") pod \"b12e4480-5f68-430b-af5a-e3c955a3b006\" (UID: \"b12e4480-5f68-430b-af5a-e3c955a3b006\") " Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.722585 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12e4480-5f68-430b-af5a-e3c955a3b006-kube-api-access-ts6st" (OuterVolumeSpecName: "kube-api-access-ts6st") pod "b12e4480-5f68-430b-af5a-e3c955a3b006" (UID: "b12e4480-5f68-430b-af5a-e3c955a3b006"). InnerVolumeSpecName "kube-api-access-ts6st". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.723029 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b12e4480-5f68-430b-af5a-e3c955a3b006" (UID: "b12e4480-5f68-430b-af5a-e3c955a3b006"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.740728 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b12e4480-5f68-430b-af5a-e3c955a3b006" (UID: "b12e4480-5f68-430b-af5a-e3c955a3b006"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.819013 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts6st\" (UniqueName: \"kubernetes.io/projected/b12e4480-5f68-430b-af5a-e3c955a3b006-kube-api-access-ts6st\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.819303 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:37 crc kubenswrapper[4929]: I1002 12:40:37.819365 4929 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b12e4480-5f68-430b-af5a-e3c955a3b006-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.320926 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-knkjw" event={"ID":"b12e4480-5f68-430b-af5a-e3c955a3b006","Type":"ContainerDied","Data":"87b7d215aa2522af7c9dcf0259d0cd6b576e1dbc045864d929540ac4be213111"} Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.321306 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b7d215aa2522af7c9dcf0259d0cd6b576e1dbc045864d929540ac4be213111" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.320987 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-knkjw" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.556559 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-74bf4ccf5-t64jz"] Oct 02 12:40:38 crc kubenswrapper[4929]: E1002 12:40:38.556987 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12e4480-5f68-430b-af5a-e3c955a3b006" containerName="barbican-db-sync" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.557008 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12e4480-5f68-430b-af5a-e3c955a3b006" containerName="barbican-db-sync" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.557224 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12e4480-5f68-430b-af5a-e3c955a3b006" containerName="barbican-db-sync" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.558368 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.561917 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.562168 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.562578 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5cmp6" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.601561 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6698dc494b-48tlq"] Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.602871 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.606323 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.621401 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74bf4ccf5-t64jz"] Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630307 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-config-data\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630345 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt2st\" (UniqueName: \"kubernetes.io/projected/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-kube-api-access-vt2st\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630373 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-config-data-custom\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630396 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-logs\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630418 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-config-data\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630448 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8383e528-282b-4570-9f27-1d9ebdc46908-logs\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630468 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-combined-ca-bundle\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630491 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-combined-ca-bundle\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630532 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-config-data-custom\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.630553 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bj7q\" (UniqueName: \"kubernetes.io/projected/8383e528-282b-4570-9f27-1d9ebdc46908-kube-api-access-9bj7q\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.652274 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6698dc494b-48tlq"] Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.675015 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-99f8d8845-dh7jz"] Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.680806 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.688117 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99f8d8845-dh7jz"] Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.733800 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-logs\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.733853 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-config-data\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.733885 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnc45\" (UniqueName: \"kubernetes.io/projected/4751f6e2-3d6e-4ddf-9584-667668162682-kube-api-access-mnc45\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.733910 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-dns-svc\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.733929 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8383e528-282b-4570-9f27-1d9ebdc46908-logs\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.733947 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-combined-ca-bundle\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.734474 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-logs\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.734625 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-sb\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.734669 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-combined-ca-bundle\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.734780 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-config-data-custom\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.734817 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bj7q\" (UniqueName: \"kubernetes.io/projected/8383e528-282b-4570-9f27-1d9ebdc46908-kube-api-access-9bj7q\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.734882 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-config\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.734929 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-nb\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 
12:40:38.735052 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-config-data\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.735083 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt2st\" (UniqueName: \"kubernetes.io/projected/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-kube-api-access-vt2st\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.735121 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-config-data-custom\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.736345 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8383e528-282b-4570-9f27-1d9ebdc46908-logs\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.744464 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-config-data-custom\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.745812 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-combined-ca-bundle\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.747399 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-config-data\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.747980 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8383e528-282b-4570-9f27-1d9ebdc46908-config-data\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.748241 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-config-data-custom\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " 
pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.763278 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c58fcb746-wqbl8"] Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.763766 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt2st\" (UniqueName: \"kubernetes.io/projected/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-kube-api-access-vt2st\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.764697 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c58fcb746-wqbl8" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.766248 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212-combined-ca-bundle\") pod \"barbican-worker-74bf4ccf5-t64jz\" (UID: \"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212\") " pod="openstack/barbican-worker-74bf4ccf5-t64jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.766631 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bj7q\" (UniqueName: \"kubernetes.io/projected/8383e528-282b-4570-9f27-1d9ebdc46908-kube-api-access-9bj7q\") pod \"barbican-keystone-listener-6698dc494b-48tlq\" (UID: \"8383e528-282b-4570-9f27-1d9ebdc46908\") " pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.768980 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.783171 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c58fcb746-wqbl8"] Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.839639 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-config-data\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.839722 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-nb\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.839773 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-combined-ca-bundle\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8" Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840168 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnc45\" (UniqueName: \"kubernetes.io/projected/4751f6e2-3d6e-4ddf-9584-667668162682-kube-api-access-mnc45\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840238 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-dns-svc\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840286 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g26g6\" (UniqueName: \"kubernetes.io/projected/e600388e-615b-43b0-a87f-0a6e4dc68ce9-kube-api-access-g26g6\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840310 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-sb\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840342 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-config-data-custom\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840444 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e600388e-615b-43b0-a87f-0a6e4dc68ce9-logs\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840506 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-config\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.840614 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-nb\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.841301 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-config\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.841362 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-sb\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.842033 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-dns-svc\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.861466 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnc45\" (UniqueName: \"kubernetes.io/projected/4751f6e2-3d6e-4ddf-9584-667668162682-kube-api-access-mnc45\") pod \"dnsmasq-dns-99f8d8845-dh7jz\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.901737 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74bf4ccf5-t64jz"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.943285 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e600388e-615b-43b0-a87f-0a6e4dc68ce9-logs\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.943353 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-config-data\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.943384 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-combined-ca-bundle\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.943449 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g26g6\" (UniqueName: \"kubernetes.io/projected/e600388e-615b-43b0-a87f-0a6e4dc68ce9-kube-api-access-g26g6\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.943475 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-config-data-custom\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.944775 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e600388e-615b-43b0-a87f-0a6e4dc68ce9-logs\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.948188 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-combined-ca-bundle\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.950986 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-config-data-custom\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.952092 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6698dc494b-48tlq"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.953983 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e600388e-615b-43b0-a87f-0a6e4dc68ce9-config-data\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:38 crc kubenswrapper[4929]: I1002 12:40:38.971203 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g26g6\" (UniqueName: \"kubernetes.io/projected/e600388e-615b-43b0-a87f-0a6e4dc68ce9-kube-api-access-g26g6\") pod \"barbican-api-6c58fcb746-wqbl8\" (UID: \"e600388e-615b-43b0-a87f-0a6e4dc68ce9\") " pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:39 crc kubenswrapper[4929]: I1002 12:40:39.002032 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:39 crc kubenswrapper[4929]: I1002 12:40:39.131434 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:39 crc kubenswrapper[4929]: I1002 12:40:39.437566 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74bf4ccf5-t64jz"]
Oct 02 12:40:39 crc kubenswrapper[4929]: W1002 12:40:39.489774 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8383e528_282b_4570_9f27_1d9ebdc46908.slice/crio-1319195f6c57b95e8328b559e3aecac49b8d83ad97cc31a7c126ada52cef03b2 WatchSource:0}: Error finding container 1319195f6c57b95e8328b559e3aecac49b8d83ad97cc31a7c126ada52cef03b2: Status 404 returned error can't find the container with id 1319195f6c57b95e8328b559e3aecac49b8d83ad97cc31a7c126ada52cef03b2
Oct 02 12:40:39 crc kubenswrapper[4929]: I1002 12:40:39.493659 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6698dc494b-48tlq"]
Oct 02 12:40:39 crc kubenswrapper[4929]: I1002 12:40:39.653158 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99f8d8845-dh7jz"]
Oct 02 12:40:39 crc kubenswrapper[4929]: I1002 12:40:39.719939 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c58fcb746-wqbl8"]
Oct 02 12:40:39 crc kubenswrapper[4929]: W1002 12:40:39.732432 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode600388e_615b_43b0_a87f_0a6e4dc68ce9.slice/crio-314885a0790e532c86de260552c3c3f153e4560f1c11c6f4392ed6c3fe5bc0b5 WatchSource:0}: Error finding container 314885a0790e532c86de260552c3c3f153e4560f1c11c6f4392ed6c3fe5bc0b5: Status 404 returned error can't find the container with id 314885a0790e532c86de260552c3c3f153e4560f1c11c6f4392ed6c3fe5bc0b5
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.344564 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" event={"ID":"8383e528-282b-4570-9f27-1d9ebdc46908","Type":"ContainerStarted","Data":"2718b81a8aed10db6b098de3e2b451357cef76389e2da9dc3cbe22fd4ce8fc27"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.345301 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" event={"ID":"8383e528-282b-4570-9f27-1d9ebdc46908","Type":"ContainerStarted","Data":"6e9a6c4c3f1e19bfcbdc8ff449cd5ec2621427affbef244bc744d863ef726890"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.345318 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" event={"ID":"8383e528-282b-4570-9f27-1d9ebdc46908","Type":"ContainerStarted","Data":"1319195f6c57b95e8328b559e3aecac49b8d83ad97cc31a7c126ada52cef03b2"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.346686 4929 generic.go:334] "Generic (PLEG): container finished" podID="4751f6e2-3d6e-4ddf-9584-667668162682" containerID="fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572" exitCode=0
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.346781 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" event={"ID":"4751f6e2-3d6e-4ddf-9584-667668162682","Type":"ContainerDied","Data":"fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.346822 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" event={"ID":"4751f6e2-3d6e-4ddf-9584-667668162682","Type":"ContainerStarted","Data":"cd429e11064a90e34dcb61af42eebab69b56076a49d577520469e8cdbfb3973d"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.348945 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c58fcb746-wqbl8" event={"ID":"e600388e-615b-43b0-a87f-0a6e4dc68ce9","Type":"ContainerStarted","Data":"58faf1357603f4ef64fab3b73ef589097d3ff686bdfb3f7406deda7f4120068a"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.349106 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c58fcb746-wqbl8" event={"ID":"e600388e-615b-43b0-a87f-0a6e4dc68ce9","Type":"ContainerStarted","Data":"314885a0790e532c86de260552c3c3f153e4560f1c11c6f4392ed6c3fe5bc0b5"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.351211 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74bf4ccf5-t64jz" event={"ID":"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212","Type":"ContainerStarted","Data":"727984d32a3709a171c086d66bee2ae4b08b637de2a91414942c2f5dd14fb46d"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.351260 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74bf4ccf5-t64jz" event={"ID":"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212","Type":"ContainerStarted","Data":"596d3a7984875c48e4e44e79b813f4451930a91a3e0cf88d9a5abf8c07edf610"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.351275 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74bf4ccf5-t64jz" event={"ID":"eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212","Type":"ContainerStarted","Data":"ce7d1d9bb1794ec7b6ce21f9e1913c25b40c25ec75a1e845e29d8b63dffa5342"}
Oct 02 12:40:40 crc kubenswrapper[4929]: I1002 12:40:40.388444 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-74bf4ccf5-t64jz" podStartSLOduration=2.388427984 podStartE2EDuration="2.388427984s" podCreationTimestamp="2025-10-02 12:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:40:40.379108676 +0000 UTC m=+5440.929475040" watchObservedRunningTime="2025-10-02 12:40:40.388427984 +0000 UTC m=+5440.938794348"
Oct 02 12:40:41 crc kubenswrapper[4929]: I1002 12:40:41.361083 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c58fcb746-wqbl8" event={"ID":"e600388e-615b-43b0-a87f-0a6e4dc68ce9","Type":"ContainerStarted","Data":"cdf442dbc1ebbd6deff8e65a47e3375234dd9a7f02a27cc68711af5001012d34"}
Oct 02 12:40:41 crc kubenswrapper[4929]: I1002 12:40:41.361380 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:41 crc kubenswrapper[4929]: I1002 12:40:41.361392 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:41 crc kubenswrapper[4929]: I1002 12:40:41.364004 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" event={"ID":"4751f6e2-3d6e-4ddf-9584-667668162682","Type":"ContainerStarted","Data":"6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457"}
Oct 02 12:40:41 crc kubenswrapper[4929]: I1002 12:40:41.381623 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c58fcb746-wqbl8" podStartSLOduration=3.381422667 podStartE2EDuration="3.381422667s" podCreationTimestamp="2025-10-02 12:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:40:41.377767582 +0000 UTC m=+5441.928133956" watchObservedRunningTime="2025-10-02 12:40:41.381422667 +0000 UTC m=+5441.931789031"
Oct 02 12:40:41 crc kubenswrapper[4929]: I1002 12:40:41.393252 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6698dc494b-48tlq" podStartSLOduration=3.393233707 podStartE2EDuration="3.393233707s" podCreationTimestamp="2025-10-02 12:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:40:41.39090277 +0000 UTC m=+5441.941269144" watchObservedRunningTime="2025-10-02 12:40:41.393233707 +0000 UTC m=+5441.943600071"
Oct 02 12:40:41 crc kubenswrapper[4929]: I1002 12:40:41.417510 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" podStartSLOduration=3.417489754 podStartE2EDuration="3.417489754s" podCreationTimestamp="2025-10-02 12:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:40:41.410832102 +0000 UTC m=+5441.961198466" watchObservedRunningTime="2025-10-02 12:40:41.417489754 +0000 UTC m=+5441.967856108"
Oct 02 12:40:42 crc kubenswrapper[4929]: I1002 12:40:42.371569 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.521083 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ph5t2"]
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.523153 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.537618 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ph5t2"]
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.639893 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-catalog-content\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.640056 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frsbd\" (UniqueName: \"kubernetes.io/projected/54ce8248-420d-4aa6-9618-956932abab29-kube-api-access-frsbd\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.640087 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-utilities\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.707946 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jht8b"]
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.709802 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.727509 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jht8b"]
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.742166 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55zb\" (UniqueName: \"kubernetes.io/projected/29622c1c-a54b-4873-aea8-4e336c3ffb35-kube-api-access-r55zb\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.742239 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-catalog-content\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.742386 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-utilities\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.742473 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frsbd\" (UniqueName: \"kubernetes.io/projected/54ce8248-420d-4aa6-9618-956932abab29-kube-api-access-frsbd\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.742511 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-catalog-content\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.742533 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-utilities\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.742727 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-catalog-content\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.743006 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-utilities\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.764762 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frsbd\" (UniqueName: \"kubernetes.io/projected/54ce8248-420d-4aa6-9618-956932abab29-kube-api-access-frsbd\") pod \"community-operators-ph5t2\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") " pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.844222 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55zb\" (UniqueName: \"kubernetes.io/projected/29622c1c-a54b-4873-aea8-4e336c3ffb35-kube-api-access-r55zb\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.844309 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-utilities\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.844353 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-catalog-content\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.845124 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-catalog-content\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.845635 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-utilities\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.858099 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:43 crc kubenswrapper[4929]: I1002 12:40:43.866473 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r55zb\" (UniqueName: \"kubernetes.io/projected/29622c1c-a54b-4873-aea8-4e336c3ffb35-kube-api-access-r55zb\") pod \"certified-operators-jht8b\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") " pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:44 crc kubenswrapper[4929]: I1002 12:40:44.027482 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:44 crc kubenswrapper[4929]: I1002 12:40:44.158253 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf"
Oct 02 12:40:44 crc kubenswrapper[4929]: E1002 12:40:44.158492 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 12:40:44 crc kubenswrapper[4929]: I1002 12:40:44.404148 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ph5t2"]
Oct 02 12:40:44 crc kubenswrapper[4929]: W1002 12:40:44.410076 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ce8248_420d_4aa6_9618_956932abab29.slice/crio-1751d24f96e1f21bab0a34e02e611f4bde9a3d7e0c4193e24ee6b5b1224d995c WatchSource:0}: Error finding container 1751d24f96e1f21bab0a34e02e611f4bde9a3d7e0c4193e24ee6b5b1224d995c: Status 404 returned error can't find the container with id 1751d24f96e1f21bab0a34e02e611f4bde9a3d7e0c4193e24ee6b5b1224d995c
Oct 02 12:40:44 crc kubenswrapper[4929]: I1002 12:40:44.618883 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jht8b"]
Oct 02 12:40:44 crc kubenswrapper[4929]: W1002 12:40:44.624579 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29622c1c_a54b_4873_aea8_4e336c3ffb35.slice/crio-a6e2de74cddbefa93940dbb2fd14258f9c629226114993159084d1552b636ae9 WatchSource:0}: Error finding container a6e2de74cddbefa93940dbb2fd14258f9c629226114993159084d1552b636ae9: Status 404 returned error can't find the container with id a6e2de74cddbefa93940dbb2fd14258f9c629226114993159084d1552b636ae9
Oct 02 12:40:45 crc kubenswrapper[4929]: I1002 12:40:45.397230 4929 generic.go:334] "Generic (PLEG): container finished" podID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerID="efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601" exitCode=0
Oct 02 12:40:45 crc kubenswrapper[4929]: I1002 12:40:45.397298 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jht8b" event={"ID":"29622c1c-a54b-4873-aea8-4e336c3ffb35","Type":"ContainerDied","Data":"efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601"}
Oct 02 12:40:45 crc kubenswrapper[4929]: I1002 12:40:45.397609 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jht8b" event={"ID":"29622c1c-a54b-4873-aea8-4e336c3ffb35","Type":"ContainerStarted","Data":"a6e2de74cddbefa93940dbb2fd14258f9c629226114993159084d1552b636ae9"}
Oct 02 12:40:45 crc kubenswrapper[4929]: I1002 12:40:45.399533 4929 generic.go:334] "Generic (PLEG): container finished" podID="54ce8248-420d-4aa6-9618-956932abab29" containerID="6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d" exitCode=0
Oct 02 12:40:45 crc kubenswrapper[4929]: I1002 12:40:45.399568 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph5t2" event={"ID":"54ce8248-420d-4aa6-9618-956932abab29","Type":"ContainerDied","Data":"6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d"}
Oct 02 12:40:45 crc kubenswrapper[4929]: I1002 12:40:45.399590 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph5t2" event={"ID":"54ce8248-420d-4aa6-9618-956932abab29","Type":"ContainerStarted","Data":"1751d24f96e1f21bab0a34e02e611f4bde9a3d7e0c4193e24ee6b5b1224d995c"}
Oct 02 12:40:48 crc kubenswrapper[4929]: I1002 12:40:48.427063 4929 generic.go:334] "Generic (PLEG): container finished" podID="54ce8248-420d-4aa6-9618-956932abab29" containerID="f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8" exitCode=0
Oct 02 12:40:48 crc kubenswrapper[4929]: I1002 12:40:48.427152 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph5t2" event={"ID":"54ce8248-420d-4aa6-9618-956932abab29","Type":"ContainerDied","Data":"f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8"}
Oct 02 12:40:48 crc kubenswrapper[4929]: I1002 12:40:48.431648 4929 generic.go:334] "Generic (PLEG): container finished" podID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerID="87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf" exitCode=0
Oct 02 12:40:48 crc kubenswrapper[4929]: I1002 12:40:48.431696 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jht8b" event={"ID":"29622c1c-a54b-4873-aea8-4e336c3ffb35","Type":"ContainerDied","Data":"87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf"}
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.004067 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz"
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.070479 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fc44c7cc-h4xmp"]
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.071071 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" podUID="d52cea74-0895-4412-8a20-d0ac843411b8" containerName="dnsmasq-dns" containerID="cri-o://125168a5c86e5f696d2bf43ac4f999f364600bdb821bedcc6139c8ad9c0be647" gracePeriod=10
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.474635 4929 generic.go:334] "Generic (PLEG): container finished" podID="d52cea74-0895-4412-8a20-d0ac843411b8" containerID="125168a5c86e5f696d2bf43ac4f999f364600bdb821bedcc6139c8ad9c0be647" exitCode=0
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.474906 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" event={"ID":"d52cea74-0895-4412-8a20-d0ac843411b8","Type":"ContainerDied","Data":"125168a5c86e5f696d2bf43ac4f999f364600bdb821bedcc6139c8ad9c0be647"}
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.516085 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph5t2" event={"ID":"54ce8248-420d-4aa6-9618-956932abab29","Type":"ContainerStarted","Data":"13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f"}
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.671155 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp"
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.693007 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ph5t2" podStartSLOduration=3.135151358 podStartE2EDuration="6.692949764s" podCreationTimestamp="2025-10-02 12:40:43 +0000 UTC" firstStartedPulling="2025-10-02 12:40:45.402786317 +0000 UTC m=+5445.953152671" lastFinishedPulling="2025-10-02 12:40:48.960584713 +0000 UTC m=+5449.510951077" observedRunningTime="2025-10-02 12:40:49.547119062 +0000 UTC m=+5450.097485426" watchObservedRunningTime="2025-10-02 12:40:49.692949764 +0000 UTC m=+5450.243316128"
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.752033 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-dns-svc\") pod \"d52cea74-0895-4412-8a20-d0ac843411b8\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") "
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.753150 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-sb\") pod \"d52cea74-0895-4412-8a20-d0ac843411b8\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") "
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.753249 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrjsr\" (UniqueName: \"kubernetes.io/projected/d52cea74-0895-4412-8a20-d0ac843411b8-kube-api-access-wrjsr\") pod \"d52cea74-0895-4412-8a20-d0ac843411b8\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") "
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.753377 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-config\") pod \"d52cea74-0895-4412-8a20-d0ac843411b8\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") "
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.753481 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-nb\") pod \"d52cea74-0895-4412-8a20-d0ac843411b8\" (UID: \"d52cea74-0895-4412-8a20-d0ac843411b8\") "
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.762492 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52cea74-0895-4412-8a20-d0ac843411b8-kube-api-access-wrjsr" (OuterVolumeSpecName: "kube-api-access-wrjsr") pod "d52cea74-0895-4412-8a20-d0ac843411b8" (UID: "d52cea74-0895-4412-8a20-d0ac843411b8"). InnerVolumeSpecName "kube-api-access-wrjsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.813063 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d52cea74-0895-4412-8a20-d0ac843411b8" (UID: "d52cea74-0895-4412-8a20-d0ac843411b8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.813603 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-config" (OuterVolumeSpecName: "config") pod "d52cea74-0895-4412-8a20-d0ac843411b8" (UID: "d52cea74-0895-4412-8a20-d0ac843411b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.813894 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d52cea74-0895-4412-8a20-d0ac843411b8" (UID: "d52cea74-0895-4412-8a20-d0ac843411b8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.839941 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d52cea74-0895-4412-8a20-d0ac843411b8" (UID: "d52cea74-0895-4412-8a20-d0ac843411b8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.855734 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-config\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.855792 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.855805 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.855817 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d52cea74-0895-4412-8a20-d0ac843411b8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:49 crc kubenswrapper[4929]: I1002 12:40:49.855827 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrjsr\" (UniqueName: \"kubernetes.io/projected/d52cea74-0895-4412-8a20-d0ac843411b8-kube-api-access-wrjsr\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.537425 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp" event={"ID":"d52cea74-0895-4412-8a20-d0ac843411b8","Type":"ContainerDied","Data":"91669054db15f729609ef052246e02a00d079020cb4ac3893e79ac3fc84975e7"}
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.537497 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc44c7cc-h4xmp"
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.537508 4929 scope.go:117] "RemoveContainer" containerID="125168a5c86e5f696d2bf43ac4f999f364600bdb821bedcc6139c8ad9c0be647"
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.570228 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jht8b" event={"ID":"29622c1c-a54b-4873-aea8-4e336c3ffb35","Type":"ContainerStarted","Data":"065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88"}
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.574727 4929 scope.go:117] "RemoveContainer" containerID="2455f56c6c5085ef02d2b8650b39d2e4e73d81e6e7a93cd6da4b383d38033817"
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.606548 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fc44c7cc-h4xmp"]
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.638209 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9fc44c7cc-h4xmp"]
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.644770 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jht8b" podStartSLOduration=3.6164297420000002 podStartE2EDuration="7.644745412s" podCreationTimestamp="2025-10-02 12:40:43 +0000 UTC" firstStartedPulling="2025-10-02 12:40:45.399206214 +0000 UTC m=+5445.949572578" lastFinishedPulling="2025-10-02 12:40:49.427521884 +0000 UTC m=+5449.977888248" observedRunningTime="2025-10-02 12:40:50.609676434 +0000 UTC m=+5451.160042798" watchObservedRunningTime="2025-10-02 12:40:50.644745412 +0000 UTC m=+5451.195111776"
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.863042 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:50 crc kubenswrapper[4929]: I1002 12:40:50.941426 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c58fcb746-wqbl8"
Oct 02 12:40:52 crc kubenswrapper[4929]: I1002 12:40:52.166289 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d52cea74-0895-4412-8a20-d0ac843411b8" path="/var/lib/kubelet/pods/d52cea74-0895-4412-8a20-d0ac843411b8/volumes"
Oct 02 12:40:53 crc kubenswrapper[4929]: I1002 12:40:53.860441 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:53 crc kubenswrapper[4929]: I1002 12:40:53.860684 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:53 crc kubenswrapper[4929]: I1002 12:40:53.904673 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:54 crc kubenswrapper[4929]: I1002 12:40:54.028206 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:54 crc kubenswrapper[4929]: I1002 12:40:54.028265 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:54 crc kubenswrapper[4929]: I1002 12:40:54.072497 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:54 crc kubenswrapper[4929]: I1002 12:40:54.675799 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:54 crc kubenswrapper[4929]: I1002 12:40:54.703113 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:55 crc kubenswrapper[4929]: I1002 12:40:55.158289 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf"
Oct 02 12:40:55 crc kubenswrapper[4929]: E1002 12:40:55.159746 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 12:40:55 crc kubenswrapper[4929]: I1002 12:40:55.700455 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ph5t2"]
Oct 02 12:40:56 crc kubenswrapper[4929]: I1002 12:40:56.643592 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ph5t2" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="registry-server" containerID="cri-o://13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f" gracePeriod=2
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.043094 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.110577 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jht8b"]
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.110774 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jht8b" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="registry-server" containerID="cri-o://065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88" gracePeriod=2
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.202149 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-utilities\") pod \"54ce8248-420d-4aa6-9618-956932abab29\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") "
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.202250 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-catalog-content\") pod \"54ce8248-420d-4aa6-9618-956932abab29\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") "
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.202334 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frsbd\" (UniqueName: \"kubernetes.io/projected/54ce8248-420d-4aa6-9618-956932abab29-kube-api-access-frsbd\") pod \"54ce8248-420d-4aa6-9618-956932abab29\" (UID: \"54ce8248-420d-4aa6-9618-956932abab29\") "
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.203678 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-utilities" (OuterVolumeSpecName: "utilities") pod "54ce8248-420d-4aa6-9618-956932abab29" (UID: "54ce8248-420d-4aa6-9618-956932abab29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.209957 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ce8248-420d-4aa6-9618-956932abab29-kube-api-access-frsbd" (OuterVolumeSpecName: "kube-api-access-frsbd") pod "54ce8248-420d-4aa6-9618-956932abab29" (UID: "54ce8248-420d-4aa6-9618-956932abab29"). InnerVolumeSpecName "kube-api-access-frsbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.267199 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54ce8248-420d-4aa6-9618-956932abab29" (UID: "54ce8248-420d-4aa6-9618-956932abab29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.304361 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frsbd\" (UniqueName: \"kubernetes.io/projected/54ce8248-420d-4aa6-9618-956932abab29-kube-api-access-frsbd\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.304390 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.304399 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ce8248-420d-4aa6-9618-956932abab29-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.445572 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.610273 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-utilities\") pod \"29622c1c-a54b-4873-aea8-4e336c3ffb35\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") "
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.610718 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r55zb\" (UniqueName: \"kubernetes.io/projected/29622c1c-a54b-4873-aea8-4e336c3ffb35-kube-api-access-r55zb\") pod \"29622c1c-a54b-4873-aea8-4e336c3ffb35\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") "
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.610884 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-catalog-content\") pod \"29622c1c-a54b-4873-aea8-4e336c3ffb35\" (UID: \"29622c1c-a54b-4873-aea8-4e336c3ffb35\") "
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.611173 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-utilities" (OuterVolumeSpecName: "utilities") pod "29622c1c-a54b-4873-aea8-4e336c3ffb35" (UID: "29622c1c-a54b-4873-aea8-4e336c3ffb35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.613827 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.615036 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29622c1c-a54b-4873-aea8-4e336c3ffb35-kube-api-access-r55zb" (OuterVolumeSpecName: "kube-api-access-r55zb") pod "29622c1c-a54b-4873-aea8-4e336c3ffb35" (UID: "29622c1c-a54b-4873-aea8-4e336c3ffb35"). InnerVolumeSpecName "kube-api-access-r55zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.652843 4929 generic.go:334] "Generic (PLEG): container finished" podID="54ce8248-420d-4aa6-9618-956932abab29" containerID="13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f" exitCode=0
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.652913 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph5t2" event={"ID":"54ce8248-420d-4aa6-9618-956932abab29","Type":"ContainerDied","Data":"13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f"}
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.652925 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ph5t2"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.652939 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ph5t2" event={"ID":"54ce8248-420d-4aa6-9618-956932abab29","Type":"ContainerDied","Data":"1751d24f96e1f21bab0a34e02e611f4bde9a3d7e0c4193e24ee6b5b1224d995c"}
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.652975 4929 scope.go:117] "RemoveContainer" containerID="13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.657752 4929 generic.go:334] "Generic (PLEG): container finished" podID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerID="065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88" exitCode=0
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.657794 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jht8b"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.657802 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jht8b" event={"ID":"29622c1c-a54b-4873-aea8-4e336c3ffb35","Type":"ContainerDied","Data":"065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88"}
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.657830 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jht8b" event={"ID":"29622c1c-a54b-4873-aea8-4e336c3ffb35","Type":"ContainerDied","Data":"a6e2de74cddbefa93940dbb2fd14258f9c629226114993159084d1552b636ae9"}
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.660451 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29622c1c-a54b-4873-aea8-4e336c3ffb35" (UID: "29622c1c-a54b-4873-aea8-4e336c3ffb35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.682349 4929 scope.go:117] "RemoveContainer" containerID="f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.691065 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ph5t2"]
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.697154 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ph5t2"]
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.715870 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r55zb\" (UniqueName: \"kubernetes.io/projected/29622c1c-a54b-4873-aea8-4e336c3ffb35-kube-api-access-r55zb\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.715906 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29622c1c-a54b-4873-aea8-4e336c3ffb35-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.716500 4929 scope.go:117] "RemoveContainer" containerID="6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.747552 4929 scope.go:117] "RemoveContainer" containerID="13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f"
Oct 02 12:40:57 crc kubenswrapper[4929]: E1002 12:40:57.748091 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f\": container with ID starting with 13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f not found: ID does not exist" containerID="13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.748132 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f"} err="failed to get container status \"13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f\": rpc error: code = NotFound desc = could not find container \"13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f\": container with ID starting with 13e26acd85673bee5f1d0fce0c3109e16d626a8a35e963c6ec7539bbcc35216f not found: ID does not exist"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.748162 4929 scope.go:117] "RemoveContainer" containerID="f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8"
Oct 02 12:40:57 crc kubenswrapper[4929]: E1002 12:40:57.748571 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8\": container with ID starting with f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8 not found: ID does not exist" containerID="f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.748636 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8"} err="failed to get container status \"f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8\": rpc error: code = NotFound desc = could not find container \"f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8\": container with ID starting with f755b55f85a8cd31467fc50ac2013595175a843c45723c3cea167102517abcc8 not found: ID does not exist"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.748664 4929 scope.go:117] "RemoveContainer" containerID="6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d"
Oct 02 12:40:57 crc kubenswrapper[4929]: E1002 12:40:57.751273 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d\": container with ID starting with 6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d not found: ID does not exist" containerID="6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.751311 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d"} err="failed to get container status \"6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d\": rpc error: code = NotFound desc = could not find container \"6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d\": container with ID starting with 6d8c540e61c6750983696511a933caeed60e56d8b5842434ab34ba2d4dec941d not found: ID does not exist"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.751327 4929 scope.go:117] "RemoveContainer" containerID="065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.802285 4929 scope.go:117] "RemoveContainer" containerID="87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.832043 4929 scope.go:117] "RemoveContainer" containerID="efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.859829 4929 scope.go:117] "RemoveContainer" containerID="065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88"
Oct 02 12:40:57 crc kubenswrapper[4929]: E1002 12:40:57.860388 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88\": container with ID starting with 065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88 not found: ID does not exist" containerID="065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.860421 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88"} err="failed to get container status \"065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88\": rpc error: code = NotFound desc = could not find container \"065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88\": container with ID starting with 065da20c0036b8d71c87d3145521a668809e486680c60f7aec537643bc9c8d88 not found: ID does not exist"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.860445 4929 scope.go:117] "RemoveContainer" containerID="87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf"
Oct 02 12:40:57 crc kubenswrapper[4929]: E1002 12:40:57.860757 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf\": container with ID starting with 87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf not found: ID does not exist" containerID="87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.860817 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf"} err="failed to get container status \"87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf\": rpc error: code = NotFound desc = could not find container \"87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf\": container with ID starting with 87ceca1487fd4db9ba76554f45e7e79170e38a203789e610f3f7c526ad744ecf not found: ID does not exist"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.860851 4929 scope.go:117] "RemoveContainer" containerID="efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601"
Oct 02 12:40:57 crc kubenswrapper[4929]: E1002 12:40:57.861194 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601\": container with ID starting with efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601 not found: ID does not exist" containerID="efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.861245 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601"} err="failed to get container status \"efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601\": rpc error: code = NotFound desc = could not find container \"efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601\": container with ID starting with efa3516b5323cf54c1f245cc9bb2b29b83a7387c66d99019629793dc0d724601 not found: ID does not exist"
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.994243 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jht8b"]
Oct 02 12:40:57 crc kubenswrapper[4929]: I1002 12:40:57.999524 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jht8b"]
Oct 02 12:40:58 crc kubenswrapper[4929]: I1002 12:40:58.172832 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" path="/var/lib/kubelet/pods/29622c1c-a54b-4873-aea8-4e336c3ffb35/volumes"
Oct 02 12:40:58 crc kubenswrapper[4929]: I1002 12:40:58.174887 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ce8248-420d-4aa6-9618-956932abab29" path="/var/lib/kubelet/pods/54ce8248-420d-4aa6-9618-956932abab29/volumes"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.880526 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-v687k"]
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881666 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="extract-content"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881691 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="extract-content"
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881752 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52cea74-0895-4412-8a20-d0ac843411b8" containerName="dnsmasq-dns"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881762 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cea74-0895-4412-8a20-d0ac843411b8" containerName="dnsmasq-dns"
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881788 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="extract-utilities"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881821 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="extract-utilities"
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881841 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="extract-content"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881850 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="extract-content"
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881871 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52cea74-0895-4412-8a20-d0ac843411b8" containerName="init"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881880 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52cea74-0895-4412-8a20-d0ac843411b8" containerName="init"
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881899 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="extract-utilities"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881910 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="extract-utilities"
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881925 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="registry-server"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881934 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="registry-server"
Oct 02 12:41:04 crc kubenswrapper[4929]: E1002 12:41:04.881975 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="registry-server"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.881986 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="registry-server"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.882227 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ce8248-420d-4aa6-9618-956932abab29" containerName="registry-server"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.882241 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="29622c1c-a54b-4873-aea8-4e336c3ffb35" containerName="registry-server"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.882278 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d52cea74-0895-4412-8a20-d0ac843411b8" containerName="dnsmasq-dns"
Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.883291 4929
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v687k" Oct 02 12:41:04 crc kubenswrapper[4929]: I1002 12:41:04.894793 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-v687k"] Oct 02 12:41:05 crc kubenswrapper[4929]: I1002 12:41:05.058380 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlq4c\" (UniqueName: \"kubernetes.io/projected/bf2a7fe3-11b1-412d-a810-e24dbf0d7656-kube-api-access-rlq4c\") pod \"neutron-db-create-v687k\" (UID: \"bf2a7fe3-11b1-412d-a810-e24dbf0d7656\") " pod="openstack/neutron-db-create-v687k" Oct 02 12:41:05 crc kubenswrapper[4929]: I1002 12:41:05.160891 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlq4c\" (UniqueName: \"kubernetes.io/projected/bf2a7fe3-11b1-412d-a810-e24dbf0d7656-kube-api-access-rlq4c\") pod \"neutron-db-create-v687k\" (UID: \"bf2a7fe3-11b1-412d-a810-e24dbf0d7656\") " pod="openstack/neutron-db-create-v687k" Oct 02 12:41:05 crc kubenswrapper[4929]: I1002 12:41:05.181258 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlq4c\" (UniqueName: \"kubernetes.io/projected/bf2a7fe3-11b1-412d-a810-e24dbf0d7656-kube-api-access-rlq4c\") pod \"neutron-db-create-v687k\" (UID: \"bf2a7fe3-11b1-412d-a810-e24dbf0d7656\") " pod="openstack/neutron-db-create-v687k" Oct 02 12:41:05 crc kubenswrapper[4929]: I1002 12:41:05.202593 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v687k" Oct 02 12:41:05 crc kubenswrapper[4929]: I1002 12:41:05.701256 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-v687k"] Oct 02 12:41:05 crc kubenswrapper[4929]: I1002 12:41:05.739167 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v687k" event={"ID":"bf2a7fe3-11b1-412d-a810-e24dbf0d7656","Type":"ContainerStarted","Data":"dad084c84ec01e5b1f40ee764640f0a45c46d4b7bdf388c7c5daa85e13817825"} Oct 02 12:41:06 crc kubenswrapper[4929]: I1002 12:41:06.157984 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:41:06 crc kubenswrapper[4929]: E1002 12:41:06.160298 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:41:06 crc kubenswrapper[4929]: I1002 12:41:06.750377 4929 generic.go:334] "Generic (PLEG): container finished" podID="bf2a7fe3-11b1-412d-a810-e24dbf0d7656" containerID="6876c249d2f639ad5f545507923c0f54fae7595929e6b936beb224aa6b535ec4" exitCode=0 Oct 02 12:41:06 crc kubenswrapper[4929]: I1002 12:41:06.750455 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v687k" event={"ID":"bf2a7fe3-11b1-412d-a810-e24dbf0d7656","Type":"ContainerDied","Data":"6876c249d2f639ad5f545507923c0f54fae7595929e6b936beb224aa6b535ec4"} Oct 02 12:41:08 crc kubenswrapper[4929]: I1002 12:41:08.077533 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-v687k" Oct 02 12:41:08 crc kubenswrapper[4929]: I1002 12:41:08.220388 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlq4c\" (UniqueName: \"kubernetes.io/projected/bf2a7fe3-11b1-412d-a810-e24dbf0d7656-kube-api-access-rlq4c\") pod \"bf2a7fe3-11b1-412d-a810-e24dbf0d7656\" (UID: \"bf2a7fe3-11b1-412d-a810-e24dbf0d7656\") " Oct 02 12:41:08 crc kubenswrapper[4929]: I1002 12:41:08.228693 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2a7fe3-11b1-412d-a810-e24dbf0d7656-kube-api-access-rlq4c" (OuterVolumeSpecName: "kube-api-access-rlq4c") pod "bf2a7fe3-11b1-412d-a810-e24dbf0d7656" (UID: "bf2a7fe3-11b1-412d-a810-e24dbf0d7656"). InnerVolumeSpecName "kube-api-access-rlq4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:08 crc kubenswrapper[4929]: I1002 12:41:08.323489 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlq4c\" (UniqueName: \"kubernetes.io/projected/bf2a7fe3-11b1-412d-a810-e24dbf0d7656-kube-api-access-rlq4c\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:08 crc kubenswrapper[4929]: I1002 12:41:08.770646 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-v687k" event={"ID":"bf2a7fe3-11b1-412d-a810-e24dbf0d7656","Type":"ContainerDied","Data":"dad084c84ec01e5b1f40ee764640f0a45c46d4b7bdf388c7c5daa85e13817825"} Oct 02 12:41:08 crc kubenswrapper[4929]: I1002 12:41:08.771066 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dad084c84ec01e5b1f40ee764640f0a45c46d4b7bdf388c7c5daa85e13817825" Oct 02 12:41:08 crc kubenswrapper[4929]: I1002 12:41:08.770707 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-v687k" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.029546 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-89fe-account-create-xvbc2"] Oct 02 12:41:15 crc kubenswrapper[4929]: E1002 12:41:15.030539 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2a7fe3-11b1-412d-a810-e24dbf0d7656" containerName="mariadb-database-create" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.030553 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2a7fe3-11b1-412d-a810-e24dbf0d7656" containerName="mariadb-database-create" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.030697 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2a7fe3-11b1-412d-a810-e24dbf0d7656" containerName="mariadb-database-create" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.031329 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-89fe-account-create-xvbc2" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.034047 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.039072 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-89fe-account-create-xvbc2"] Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.145354 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95lpm\" (UniqueName: \"kubernetes.io/projected/6a6695bb-e510-4974-94c6-9f18ebe5af7e-kube-api-access-95lpm\") pod \"neutron-89fe-account-create-xvbc2\" (UID: \"6a6695bb-e510-4974-94c6-9f18ebe5af7e\") " pod="openstack/neutron-89fe-account-create-xvbc2" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.248279 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95lpm\" (UniqueName: \"kubernetes.io/projected/6a6695bb-e510-4974-94c6-9f18ebe5af7e-kube-api-access-95lpm\") pod \"neutron-89fe-account-create-xvbc2\" (UID: \"6a6695bb-e510-4974-94c6-9f18ebe5af7e\") " pod="openstack/neutron-89fe-account-create-xvbc2" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.273327 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95lpm\" (UniqueName: \"kubernetes.io/projected/6a6695bb-e510-4974-94c6-9f18ebe5af7e-kube-api-access-95lpm\") pod \"neutron-89fe-account-create-xvbc2\" (UID: \"6a6695bb-e510-4974-94c6-9f18ebe5af7e\") " pod="openstack/neutron-89fe-account-create-xvbc2" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.365944 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-89fe-account-create-xvbc2" Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.805167 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-89fe-account-create-xvbc2"] Oct 02 12:41:15 crc kubenswrapper[4929]: I1002 12:41:15.836044 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-89fe-account-create-xvbc2" event={"ID":"6a6695bb-e510-4974-94c6-9f18ebe5af7e","Type":"ContainerStarted","Data":"c588c9df006d02e90c7f1f5ccfcb564cb9aa4fdb495290b2172f2e0c3e665a4f"} Oct 02 12:41:16 crc kubenswrapper[4929]: I1002 12:41:16.845481 4929 generic.go:334] "Generic (PLEG): container finished" podID="6a6695bb-e510-4974-94c6-9f18ebe5af7e" containerID="6d7fb57df0b2ded2b8c3068839405a760768eb97113eda8bd4017ceae45306c1" exitCode=0 Oct 02 12:41:16 crc kubenswrapper[4929]: I1002 12:41:16.845547 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-89fe-account-create-xvbc2" event={"ID":"6a6695bb-e510-4974-94c6-9f18ebe5af7e","Type":"ContainerDied","Data":"6d7fb57df0b2ded2b8c3068839405a760768eb97113eda8bd4017ceae45306c1"} Oct 02 12:41:18 crc kubenswrapper[4929]: I1002 12:41:18.161244 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-89fe-account-create-xvbc2" Oct 02 12:41:18 crc kubenswrapper[4929]: I1002 12:41:18.299301 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95lpm\" (UniqueName: \"kubernetes.io/projected/6a6695bb-e510-4974-94c6-9f18ebe5af7e-kube-api-access-95lpm\") pod \"6a6695bb-e510-4974-94c6-9f18ebe5af7e\" (UID: \"6a6695bb-e510-4974-94c6-9f18ebe5af7e\") " Oct 02 12:41:18 crc kubenswrapper[4929]: I1002 12:41:18.304035 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a6695bb-e510-4974-94c6-9f18ebe5af7e-kube-api-access-95lpm" (OuterVolumeSpecName: "kube-api-access-95lpm") pod "6a6695bb-e510-4974-94c6-9f18ebe5af7e" (UID: "6a6695bb-e510-4974-94c6-9f18ebe5af7e"). InnerVolumeSpecName "kube-api-access-95lpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:18 crc kubenswrapper[4929]: I1002 12:41:18.401132 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95lpm\" (UniqueName: \"kubernetes.io/projected/6a6695bb-e510-4974-94c6-9f18ebe5af7e-kube-api-access-95lpm\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:18 crc kubenswrapper[4929]: I1002 12:41:18.863602 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-89fe-account-create-xvbc2" event={"ID":"6a6695bb-e510-4974-94c6-9f18ebe5af7e","Type":"ContainerDied","Data":"c588c9df006d02e90c7f1f5ccfcb564cb9aa4fdb495290b2172f2e0c3e665a4f"} Oct 02 12:41:18 crc kubenswrapper[4929]: I1002 12:41:18.863646 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c588c9df006d02e90c7f1f5ccfcb564cb9aa4fdb495290b2172f2e0c3e665a4f" Oct 02 12:41:18 crc kubenswrapper[4929]: I1002 12:41:18.863655 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-89fe-account-create-xvbc2" Oct 02 12:41:19 crc kubenswrapper[4929]: I1002 12:41:19.156527 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:41:19 crc kubenswrapper[4929]: E1002 12:41:19.156934 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.177055 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-k7z8p"] Oct 02 12:41:20 crc kubenswrapper[4929]: E1002 12:41:20.177796 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a6695bb-e510-4974-94c6-9f18ebe5af7e" containerName="mariadb-account-create" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.177812 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a6695bb-e510-4974-94c6-9f18ebe5af7e" containerName="mariadb-account-create" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.178034 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a6695bb-e510-4974-94c6-9f18ebe5af7e" containerName="mariadb-account-create" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.178686 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.181423 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.181704 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-km26f" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.181999 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.186466 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k7z8p"] Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.328378 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-config\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.328722 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-combined-ca-bundle\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.328799 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv2nn\" (UniqueName: \"kubernetes.io/projected/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-kube-api-access-cv2nn\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.430479 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-config\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.430922 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-combined-ca-bundle\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.431059 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv2nn\" (UniqueName: \"kubernetes.io/projected/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-kube-api-access-cv2nn\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.437943 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-config\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.439714 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-combined-ca-bundle\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.460652 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv2nn\" (UniqueName: \"kubernetes.io/projected/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-kube-api-access-cv2nn\") pod \"neutron-db-sync-k7z8p\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:20 crc kubenswrapper[4929]: I1002 12:41:20.506757 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:21 crc kubenswrapper[4929]: I1002 12:41:21.000699 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k7z8p"] Oct 02 12:41:21 crc kubenswrapper[4929]: I1002 12:41:21.895774 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k7z8p" event={"ID":"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8","Type":"ContainerStarted","Data":"1ca05a5496be8ad4a20e3f2664248a7507a36abb73e0f9db27329ed932ff256b"} Oct 02 12:41:21 crc kubenswrapper[4929]: I1002 12:41:21.896082 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k7z8p" event={"ID":"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8","Type":"ContainerStarted","Data":"7e3852f342578a50f04af85dd1141a689e3104783628adabe43dcd86194d599d"} Oct 02 12:41:21 crc kubenswrapper[4929]: I1002 12:41:21.915051 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-k7z8p" podStartSLOduration=1.915033317 podStartE2EDuration="1.915033317s" podCreationTimestamp="2025-10-02 12:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:41:21.90747491 +0000 UTC m=+5482.457841284" watchObservedRunningTime="2025-10-02 12:41:21.915033317 +0000 UTC m=+5482.465399681" Oct 02 12:41:25 crc kubenswrapper[4929]: I1002 12:41:25.928613 4929 generic.go:334] "Generic (PLEG): container finished" podID="01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" containerID="1ca05a5496be8ad4a20e3f2664248a7507a36abb73e0f9db27329ed932ff256b" exitCode=0 Oct 02 12:41:25 crc kubenswrapper[4929]: I1002 12:41:25.928648 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k7z8p" event={"ID":"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8","Type":"ContainerDied","Data":"1ca05a5496be8ad4a20e3f2664248a7507a36abb73e0f9db27329ed932ff256b"} Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.205072 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.264456 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv2nn\" (UniqueName: \"kubernetes.io/projected/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-kube-api-access-cv2nn\") pod \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.264530 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-combined-ca-bundle\") pod \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.264660 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-config\") pod \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\" (UID: \"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8\") " Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.269427 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-kube-api-access-cv2nn" (OuterVolumeSpecName: "kube-api-access-cv2nn") pod "01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" (UID: "01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8"). InnerVolumeSpecName "kube-api-access-cv2nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.287507 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" (UID: "01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.291712 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-config" (OuterVolumeSpecName: "config") pod "01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" (UID: "01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.367502 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv2nn\" (UniqueName: \"kubernetes.io/projected/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-kube-api-access-cv2nn\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.367541 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.367613 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.945168 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k7z8p" event={"ID":"01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8","Type":"ContainerDied","Data":"7e3852f342578a50f04af85dd1141a689e3104783628adabe43dcd86194d599d"} Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.945214 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e3852f342578a50f04af85dd1141a689e3104783628adabe43dcd86194d599d" Oct 02 12:41:27 crc kubenswrapper[4929]: I1002 12:41:27.945243 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k7z8p" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.186393 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b84f67b9c-svnjr"] Oct 02 12:41:28 crc kubenswrapper[4929]: E1002 12:41:28.187020 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" containerName="neutron-db-sync" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.187137 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" containerName="neutron-db-sync" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.187450 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" containerName="neutron-db-sync" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.188725 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.204467 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b84f67b9c-svnjr"] Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.289117 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-sb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.289227 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-nb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.289285 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-dns-svc\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.289717 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-config\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.289776 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8zfb\" (UniqueName: \"kubernetes.io/projected/2b4b3e26-83a9-4640-866e-17e037bdbbf9-kube-api-access-h8zfb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.328646 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5db984bf45-rgjlr"] Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.330093 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.332014 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.332180 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-km26f" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.332500 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.343494 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5db984bf45-rgjlr"] Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391414 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-config\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391481 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-config\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391507 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8zfb\" (UniqueName: \"kubernetes.io/projected/2b4b3e26-83a9-4640-866e-17e037bdbbf9-kube-api-access-h8zfb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391552 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-sb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391591 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-combined-ca-bundle\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391629 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-nb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391675 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-httpd-config\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391704 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-dns-svc\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.391749 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cktw\" (UniqueName: \"kubernetes.io/projected/eabfaed5-9966-4922-bad1-f5cf35ab06eb-kube-api-access-9cktw\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.392338 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-config\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.392792 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-sb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.392895 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-nb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.393922 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-dns-svc\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.424607 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8zfb\" (UniqueName: \"kubernetes.io/projected/2b4b3e26-83a9-4640-866e-17e037bdbbf9-kube-api-access-h8zfb\") pod \"dnsmasq-dns-b84f67b9c-svnjr\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.492950 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cktw\" (UniqueName: \"kubernetes.io/projected/eabfaed5-9966-4922-bad1-f5cf35ab06eb-kube-api-access-9cktw\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.493142 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-config\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.493225 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-combined-ca-bundle\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.493294 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-httpd-config\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.497797 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-httpd-config\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.498097 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-config\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.501234 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabfaed5-9966-4922-bad1-f5cf35ab06eb-combined-ca-bundle\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.507746 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.513298 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cktw\" (UniqueName: \"kubernetes.io/projected/eabfaed5-9966-4922-bad1-f5cf35ab06eb-kube-api-access-9cktw\") pod \"neutron-5db984bf45-rgjlr\" (UID: \"eabfaed5-9966-4922-bad1-f5cf35ab06eb\") " pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:28 crc kubenswrapper[4929]: I1002 12:41:28.651420 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.020946 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b84f67b9c-svnjr"] Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.273166 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5db984bf45-rgjlr"] Oct 02 12:41:29 crc kubenswrapper[4929]: W1002 12:41:29.309666 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeabfaed5_9966_4922_bad1_f5cf35ab06eb.slice/crio-9ccfc71535143255853cfe2f6232d0734002a04a8ae4e04f502af39463f470b2 WatchSource:0}: Error finding container 9ccfc71535143255853cfe2f6232d0734002a04a8ae4e04f502af39463f470b2: Status 404 returned error can't find the container with id 9ccfc71535143255853cfe2f6232d0734002a04a8ae4e04f502af39463f470b2 Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.961660 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5db984bf45-rgjlr" event={"ID":"eabfaed5-9966-4922-bad1-f5cf35ab06eb","Type":"ContainerStarted","Data":"d1213abbab4bc6b857f12d94f023f644ad2b91e03a5a995e1152bef1fe8a265c"} Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.962069 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5db984bf45-rgjlr" event={"ID":"eabfaed5-9966-4922-bad1-f5cf35ab06eb","Type":"ContainerStarted","Data":"049017c3e4e7cfb9e57224bbbd379f5738c2e61a57face255d5f643de47b7361"} Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.962089 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5db984bf45-rgjlr" event={"ID":"eabfaed5-9966-4922-bad1-f5cf35ab06eb","Type":"ContainerStarted","Data":"9ccfc71535143255853cfe2f6232d0734002a04a8ae4e04f502af39463f470b2"} Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.962173 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.963655 4929 generic.go:334] "Generic (PLEG): container finished" podID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerID="4660f85746d9dc833cfb15cd11f1678298ae51af202e38e485aa91eeef8a835a" exitCode=0 Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.963693 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" event={"ID":"2b4b3e26-83a9-4640-866e-17e037bdbbf9","Type":"ContainerDied","Data":"4660f85746d9dc833cfb15cd11f1678298ae51af202e38e485aa91eeef8a835a"} Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.963713 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" event={"ID":"2b4b3e26-83a9-4640-866e-17e037bdbbf9","Type":"ContainerStarted","Data":"848502dbff8e1d8bdc88495ef5d5d1a691510236ebceb637baab705a031bc7f4"} Oct 02 12:41:29 crc kubenswrapper[4929]: I1002 12:41:29.999159 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5db984bf45-rgjlr" podStartSLOduration=1.999137276 podStartE2EDuration="1.999137276s" podCreationTimestamp="2025-10-02 12:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:41:29.984089214 +0000 UTC m=+5490.534455588" watchObservedRunningTime="2025-10-02 12:41:29.999137276 +0000 UTC m=+5490.549503640" Oct 02 12:41:30 crc kubenswrapper[4929]: I1002 12:41:30.974345 4929 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" event={"ID":"2b4b3e26-83a9-4640-866e-17e037bdbbf9","Type":"ContainerStarted","Data":"2e7881f32499e05fb1678074b477846d647205c96e4702c4d86791d0e2a847ff"} Oct 02 12:41:30 crc kubenswrapper[4929]: I1002 12:41:30.998470 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" podStartSLOduration=2.998434351 podStartE2EDuration="2.998434351s" podCreationTimestamp="2025-10-02 12:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:41:30.993413346 +0000 UTC m=+5491.543779720" watchObservedRunningTime="2025-10-02 12:41:30.998434351 +0000 UTC m=+5491.548800725" Oct 02 12:41:31 crc kubenswrapper[4929]: I1002 12:41:31.156413 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:41:31 crc kubenswrapper[4929]: E1002 12:41:31.156646 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:41:31 crc kubenswrapper[4929]: I1002 12:41:31.981813 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:38 crc kubenswrapper[4929]: I1002 12:41:38.509243 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:41:38 crc kubenswrapper[4929]: I1002 12:41:38.568757 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99f8d8845-dh7jz"] Oct 02 12:41:38 crc kubenswrapper[4929]: I1002 12:41:38.569061 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" containerName="dnsmasq-dns" containerID="cri-o://6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457" gracePeriod=10 Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.017344 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.050105 4929 generic.go:334] "Generic (PLEG): container finished" podID="4751f6e2-3d6e-4ddf-9584-667668162682" containerID="6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457" exitCode=0 Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.050160 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" event={"ID":"4751f6e2-3d6e-4ddf-9584-667668162682","Type":"ContainerDied","Data":"6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457"} Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.050191 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" event={"ID":"4751f6e2-3d6e-4ddf-9584-667668162682","Type":"ContainerDied","Data":"cd429e11064a90e34dcb61af42eebab69b56076a49d577520469e8cdbfb3973d"} Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.050211 4929 scope.go:117] "RemoveContainer" containerID="6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.050353 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.077433 4929 scope.go:117] "RemoveContainer" containerID="fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.101551 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-sb\") pod \"4751f6e2-3d6e-4ddf-9584-667668162682\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.101860 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-config\") pod \"4751f6e2-3d6e-4ddf-9584-667668162682\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.101899 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-nb\") pod \"4751f6e2-3d6e-4ddf-9584-667668162682\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.101916 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-dns-svc\") pod \"4751f6e2-3d6e-4ddf-9584-667668162682\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.102044 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnc45\" (UniqueName: \"kubernetes.io/projected/4751f6e2-3d6e-4ddf-9584-667668162682-kube-api-access-mnc45\") pod \"4751f6e2-3d6e-4ddf-9584-667668162682\" (UID: \"4751f6e2-3d6e-4ddf-9584-667668162682\") " Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.106874 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4751f6e2-3d6e-4ddf-9584-667668162682-kube-api-access-mnc45" (OuterVolumeSpecName: "kube-api-access-mnc45") pod 
"4751f6e2-3d6e-4ddf-9584-667668162682" (UID: "4751f6e2-3d6e-4ddf-9584-667668162682"). InnerVolumeSpecName "kube-api-access-mnc45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.119340 4929 scope.go:117] "RemoveContainer" containerID="6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457" Oct 02 12:41:39 crc kubenswrapper[4929]: E1002 12:41:39.119809 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457\": container with ID starting with 6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457 not found: ID does not exist" containerID="6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.119838 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457"} err="failed to get container status \"6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457\": rpc error: code = NotFound desc = could not find container \"6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457\": container with ID starting with 6b066fb245155d60687f1354f352c2fdf3d1c5e67c219f9ef4c56b3fc9e25457 not found: ID does not exist" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.119861 4929 scope.go:117] "RemoveContainer" containerID="fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572" Oct 02 12:41:39 crc kubenswrapper[4929]: E1002 12:41:39.120215 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572\": container with ID starting with fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572 not found: ID does not exist" containerID="fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.120232 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572"} err="failed to get container status \"fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572\": rpc error: code = NotFound desc = could not find container \"fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572\": container with ID starting with fd10a0be3f3c18e3b5be88182d454c09f9fec74ec4881db28802152e5c8e3572 not found: ID does not exist" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.142634 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-config" (OuterVolumeSpecName: "config") pod "4751f6e2-3d6e-4ddf-9584-667668162682" (UID: "4751f6e2-3d6e-4ddf-9584-667668162682"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.144259 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4751f6e2-3d6e-4ddf-9584-667668162682" (UID: "4751f6e2-3d6e-4ddf-9584-667668162682"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.154416 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4751f6e2-3d6e-4ddf-9584-667668162682" (UID: "4751f6e2-3d6e-4ddf-9584-667668162682"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.157631 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4751f6e2-3d6e-4ddf-9584-667668162682" (UID: "4751f6e2-3d6e-4ddf-9584-667668162682"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.207947 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnc45\" (UniqueName: \"kubernetes.io/projected/4751f6e2-3d6e-4ddf-9584-667668162682-kube-api-access-mnc45\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.208012 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.208033 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.210188 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.210222 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4751f6e2-3d6e-4ddf-9584-667668162682-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.380231 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99f8d8845-dh7jz"] Oct 02 12:41:39 crc kubenswrapper[4929]: I1002 12:41:39.390468 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-99f8d8845-dh7jz"] Oct 02 12:41:40 crc kubenswrapper[4929]: I1002 12:41:40.169585 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" path="/var/lib/kubelet/pods/4751f6e2-3d6e-4ddf-9584-667668162682/volumes" Oct 02 12:41:42 crc kubenswrapper[4929]: I1002 12:41:42.158122 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:41:42 crc kubenswrapper[4929]: E1002 12:41:42.159029 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:41:44 crc kubenswrapper[4929]: I1002 12:41:44.003294 
4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-99f8d8845-dh7jz" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.29:5353: i/o timeout" Oct 02 12:41:55 crc kubenswrapper[4929]: I1002 12:41:55.157211 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:41:55 crc kubenswrapper[4929]: E1002 12:41:55.158221 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:41:58 crc kubenswrapper[4929]: I1002 12:41:58.660829 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5db984bf45-rgjlr" Oct 02 12:42:05 crc kubenswrapper[4929]: I1002 12:42:05.962539 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qps82"] Oct 02 12:42:05 crc kubenswrapper[4929]: E1002 12:42:05.963730 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" containerName="dnsmasq-dns" Oct 02 12:42:05 crc kubenswrapper[4929]: I1002 12:42:05.963826 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" containerName="dnsmasq-dns" Oct 02 12:42:05 crc kubenswrapper[4929]: E1002 12:42:05.963857 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" containerName="init" Oct 02 12:42:05 crc kubenswrapper[4929]: I1002 12:42:05.963866 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" containerName="init" Oct 02 12:42:05 crc kubenswrapper[4929]: I1002 12:42:05.964121 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4751f6e2-3d6e-4ddf-9584-667668162682" containerName="dnsmasq-dns" Oct 02 12:42:05 crc kubenswrapper[4929]: I1002 12:42:05.965010 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qps82" Oct 02 12:42:05 crc kubenswrapper[4929]: I1002 12:42:05.975220 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qps82"] Oct 02 12:42:06 crc kubenswrapper[4929]: I1002 12:42:06.027912 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btlqh\" (UniqueName: \"kubernetes.io/projected/11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee-kube-api-access-btlqh\") pod \"glance-db-create-qps82\" (UID: \"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee\") " pod="openstack/glance-db-create-qps82" Oct 02 12:42:06 crc kubenswrapper[4929]: I1002 12:42:06.130901 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btlqh\" (UniqueName: \"kubernetes.io/projected/11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee-kube-api-access-btlqh\") pod \"glance-db-create-qps82\" (UID: \"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee\") " pod="openstack/glance-db-create-qps82" Oct 02 12:42:06 crc kubenswrapper[4929]: I1002 12:42:06.152945 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btlqh\" (UniqueName: \"kubernetes.io/projected/11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee-kube-api-access-btlqh\") pod \"glance-db-create-qps82\" (UID: \"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee\") " pod="openstack/glance-db-create-qps82" Oct 02 12:42:06 crc kubenswrapper[4929]: I1002 12:42:06.294661 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qps82" Oct 02 12:42:06 crc kubenswrapper[4929]: I1002 12:42:06.794456 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qps82"] Oct 02 12:42:07 crc kubenswrapper[4929]: I1002 12:42:07.354007 4929 generic.go:334] "Generic (PLEG): container finished" podID="11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee" containerID="a29e5e5b7fea7c8049995bba6b38042f5e1721a48dccaee00a451c03da7c1c00" exitCode=0 Oct 02 12:42:07 crc kubenswrapper[4929]: I1002 12:42:07.354158 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qps82" event={"ID":"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee","Type":"ContainerDied","Data":"a29e5e5b7fea7c8049995bba6b38042f5e1721a48dccaee00a451c03da7c1c00"} Oct 02 12:42:07 crc kubenswrapper[4929]: I1002 12:42:07.354345 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qps82" event={"ID":"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee","Type":"ContainerStarted","Data":"502d94b5b4ba207a7fa58c4d12892f1781eea3a28e55d863390d4a363a3b3263"} Oct 02 12:42:08 crc kubenswrapper[4929]: I1002 12:42:08.712678 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qps82" Oct 02 12:42:08 crc kubenswrapper[4929]: I1002 12:42:08.784437 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btlqh\" (UniqueName: \"kubernetes.io/projected/11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee-kube-api-access-btlqh\") pod \"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee\" (UID: \"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee\") " Oct 02 12:42:08 crc kubenswrapper[4929]: I1002 12:42:08.794599 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee-kube-api-access-btlqh" (OuterVolumeSpecName: "kube-api-access-btlqh") pod "11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee" (UID: "11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee"). 
InnerVolumeSpecName "kube-api-access-btlqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:08 crc kubenswrapper[4929]: I1002 12:42:08.885723 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btlqh\" (UniqueName: \"kubernetes.io/projected/11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee-kube-api-access-btlqh\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:09 crc kubenswrapper[4929]: I1002 12:42:09.157406 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:42:09 crc kubenswrapper[4929]: E1002 12:42:09.157692 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:42:09 crc kubenswrapper[4929]: I1002 12:42:09.383176 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qps82" event={"ID":"11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee","Type":"ContainerDied","Data":"502d94b5b4ba207a7fa58c4d12892f1781eea3a28e55d863390d4a363a3b3263"} Oct 02 12:42:09 crc kubenswrapper[4929]: I1002 12:42:09.383237 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="502d94b5b4ba207a7fa58c4d12892f1781eea3a28e55d863390d4a363a3b3263" Oct 02 12:42:09 crc kubenswrapper[4929]: I1002 12:42:09.383334 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qps82" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.015494 4929 scope.go:117] "RemoveContainer" containerID="6e4983a49eff6dd55026594ee3059f3c1b20d756e9676a2e7ca9fbb42c6c7058" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.036861 4929 scope.go:117] "RemoveContainer" containerID="da01a8a5b101c3a223ff6a90f1a0b0866f68f94729910139031dfe1079fdfafe" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.096701 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2da6-account-create-4nhvz"] Oct 02 12:42:16 crc kubenswrapper[4929]: E1002 12:42:16.097139 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee" containerName="mariadb-database-create" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.097161 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee" containerName="mariadb-database-create" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.097397 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee" containerName="mariadb-database-create" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.098156 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2da6-account-create-4nhvz" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.100089 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.119482 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2da6-account-create-4nhvz"] Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.219343 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8cx5\" (UniqueName: \"kubernetes.io/projected/1d9d4049-dac2-4fa5-abc5-e048122a0672-kube-api-access-r8cx5\") pod \"glance-2da6-account-create-4nhvz\" (UID: \"1d9d4049-dac2-4fa5-abc5-e048122a0672\") " pod="openstack/glance-2da6-account-create-4nhvz" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.321136 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8cx5\" (UniqueName: \"kubernetes.io/projected/1d9d4049-dac2-4fa5-abc5-e048122a0672-kube-api-access-r8cx5\") pod \"glance-2da6-account-create-4nhvz\" (UID: \"1d9d4049-dac2-4fa5-abc5-e048122a0672\") " pod="openstack/glance-2da6-account-create-4nhvz" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.337557 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8cx5\" (UniqueName: \"kubernetes.io/projected/1d9d4049-dac2-4fa5-abc5-e048122a0672-kube-api-access-r8cx5\") pod \"glance-2da6-account-create-4nhvz\" (UID: \"1d9d4049-dac2-4fa5-abc5-e048122a0672\") " pod="openstack/glance-2da6-account-create-4nhvz" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.450850 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2da6-account-create-4nhvz" Oct 02 12:42:16 crc kubenswrapper[4929]: I1002 12:42:16.917847 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2da6-account-create-4nhvz"] Oct 02 12:42:17 crc kubenswrapper[4929]: I1002 12:42:17.447594 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2da6-account-create-4nhvz" event={"ID":"1d9d4049-dac2-4fa5-abc5-e048122a0672","Type":"ContainerStarted","Data":"d31ae9d8ee9133aabd44db7e50e650ec7046111eff6296e8818f43699f8db055"} Oct 02 12:42:17 crc kubenswrapper[4929]: I1002 12:42:17.448043 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2da6-account-create-4nhvz" event={"ID":"1d9d4049-dac2-4fa5-abc5-e048122a0672","Type":"ContainerStarted","Data":"22a389439e0c41a69874a07cdfc26a2a1b4852d6c0f0b42a36c482b13f703762"} Oct 02 12:42:18 crc kubenswrapper[4929]: I1002 12:42:18.456173 4929 generic.go:334] "Generic (PLEG): container finished" podID="1d9d4049-dac2-4fa5-abc5-e048122a0672" containerID="d31ae9d8ee9133aabd44db7e50e650ec7046111eff6296e8818f43699f8db055" exitCode=0 Oct 02 12:42:18 crc kubenswrapper[4929]: I1002 12:42:18.456509 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2da6-account-create-4nhvz" event={"ID":"1d9d4049-dac2-4fa5-abc5-e048122a0672","Type":"ContainerDied","Data":"d31ae9d8ee9133aabd44db7e50e650ec7046111eff6296e8818f43699f8db055"} Oct 02 12:42:18 crc kubenswrapper[4929]: I1002 12:42:18.744565 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2da6-account-create-4nhvz" Oct 02 12:42:18 crc kubenswrapper[4929]: I1002 12:42:18.863078 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8cx5\" (UniqueName: \"kubernetes.io/projected/1d9d4049-dac2-4fa5-abc5-e048122a0672-kube-api-access-r8cx5\") pod \"1d9d4049-dac2-4fa5-abc5-e048122a0672\" (UID: \"1d9d4049-dac2-4fa5-abc5-e048122a0672\") " Oct 02 12:42:18 crc kubenswrapper[4929]: I1002 12:42:18.868373 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9d4049-dac2-4fa5-abc5-e048122a0672-kube-api-access-r8cx5" (OuterVolumeSpecName: "kube-api-access-r8cx5") pod "1d9d4049-dac2-4fa5-abc5-e048122a0672" (UID: "1d9d4049-dac2-4fa5-abc5-e048122a0672"). InnerVolumeSpecName "kube-api-access-r8cx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:18 crc kubenswrapper[4929]: I1002 12:42:18.965544 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8cx5\" (UniqueName: \"kubernetes.io/projected/1d9d4049-dac2-4fa5-abc5-e048122a0672-kube-api-access-r8cx5\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:19 crc kubenswrapper[4929]: I1002 12:42:19.467024 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2da6-account-create-4nhvz" event={"ID":"1d9d4049-dac2-4fa5-abc5-e048122a0672","Type":"ContainerDied","Data":"22a389439e0c41a69874a07cdfc26a2a1b4852d6c0f0b42a36c482b13f703762"} Oct 02 12:42:19 crc kubenswrapper[4929]: I1002 12:42:19.467306 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22a389439e0c41a69874a07cdfc26a2a1b4852d6c0f0b42a36c482b13f703762" Oct 02 12:42:19 crc kubenswrapper[4929]: I1002 12:42:19.467090 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2da6-account-create-4nhvz" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.156637 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:42:21 crc kubenswrapper[4929]: E1002 12:42:21.157104 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.249037 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9plpw"] Oct 02 12:42:21 crc kubenswrapper[4929]: E1002 12:42:21.249507 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9d4049-dac2-4fa5-abc5-e048122a0672" containerName="mariadb-account-create" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.249530 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9d4049-dac2-4fa5-abc5-e048122a0672" containerName="mariadb-account-create" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.249774 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9d4049-dac2-4fa5-abc5-e048122a0672" containerName="mariadb-account-create" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.250525 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.253039 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ltb8t" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.253895 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.257029 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9plpw"] Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.412924 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-config-data\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.413041 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx2ph\" (UniqueName: \"kubernetes.io/projected/28075288-21df-4796-ae55-feb4d7f91163-kube-api-access-sx2ph\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.413098 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-combined-ca-bundle\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.413201 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-db-sync-config-data\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.514632 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-combined-ca-bundle\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.514687 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-db-sync-config-data\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.514784 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-config-data\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.514865 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx2ph\" (UniqueName: \"kubernetes.io/projected/28075288-21df-4796-ae55-feb4d7f91163-kube-api-access-sx2ph\") pod 
\"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.519355 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-config-data\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.519746 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-db-sync-config-data\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.528597 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-combined-ca-bundle\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.533214 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx2ph\" (UniqueName: \"kubernetes.io/projected/28075288-21df-4796-ae55-feb4d7f91163-kube-api-access-sx2ph\") pod \"glance-db-sync-9plpw\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:21 crc kubenswrapper[4929]: I1002 12:42:21.573096 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:22 crc kubenswrapper[4929]: I1002 12:42:22.083463 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9plpw"] Oct 02 12:42:22 crc kubenswrapper[4929]: I1002 12:42:22.493803 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9plpw" event={"ID":"28075288-21df-4796-ae55-feb4d7f91163","Type":"ContainerStarted","Data":"95f91e747f4d8b1450545ee7fe9c8a7e64a6b86477420701a9f98f4fcb9e3099"} Oct 02 12:42:23 crc kubenswrapper[4929]: I1002 12:42:23.507295 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9plpw" event={"ID":"28075288-21df-4796-ae55-feb4d7f91163","Type":"ContainerStarted","Data":"30362c07cff42ad908bb82c2f78e65f4fb77d025662016d1e5492496521dedd2"} Oct 02 12:42:23 crc kubenswrapper[4929]: I1002 12:42:23.528941 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9plpw" podStartSLOduration=2.528916255 podStartE2EDuration="2.528916255s" podCreationTimestamp="2025-10-02 12:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:23.520827793 +0000 UTC m=+5544.071194157" watchObservedRunningTime="2025-10-02 12:42:23.528916255 +0000 UTC m=+5544.079282619" Oct 02 12:42:26 crc kubenswrapper[4929]: I1002 12:42:26.541153 4929 generic.go:334] "Generic (PLEG): container finished" podID="28075288-21df-4796-ae55-feb4d7f91163" containerID="30362c07cff42ad908bb82c2f78e65f4fb77d025662016d1e5492496521dedd2" exitCode=0 Oct 02 12:42:26 crc kubenswrapper[4929]: I1002 12:42:26.541259 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9plpw" 
event={"ID":"28075288-21df-4796-ae55-feb4d7f91163","Type":"ContainerDied","Data":"30362c07cff42ad908bb82c2f78e65f4fb77d025662016d1e5492496521dedd2"} Oct 02 12:42:27 crc kubenswrapper[4929]: I1002 12:42:27.905888 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.025377 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-db-sync-config-data\") pod \"28075288-21df-4796-ae55-feb4d7f91163\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.025447 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx2ph\" (UniqueName: \"kubernetes.io/projected/28075288-21df-4796-ae55-feb4d7f91163-kube-api-access-sx2ph\") pod \"28075288-21df-4796-ae55-feb4d7f91163\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.025481 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-combined-ca-bundle\") pod \"28075288-21df-4796-ae55-feb4d7f91163\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.025541 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-config-data\") pod \"28075288-21df-4796-ae55-feb4d7f91163\" (UID: \"28075288-21df-4796-ae55-feb4d7f91163\") " Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.032159 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "28075288-21df-4796-ae55-feb4d7f91163" (UID: "28075288-21df-4796-ae55-feb4d7f91163"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.032212 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28075288-21df-4796-ae55-feb4d7f91163-kube-api-access-sx2ph" (OuterVolumeSpecName: "kube-api-access-sx2ph") pod "28075288-21df-4796-ae55-feb4d7f91163" (UID: "28075288-21df-4796-ae55-feb4d7f91163"). InnerVolumeSpecName "kube-api-access-sx2ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.070639 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28075288-21df-4796-ae55-feb4d7f91163" (UID: "28075288-21df-4796-ae55-feb4d7f91163"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.075923 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-config-data" (OuterVolumeSpecName: "config-data") pod "28075288-21df-4796-ae55-feb4d7f91163" (UID: "28075288-21df-4796-ae55-feb4d7f91163"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.127657 4929 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.127699 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx2ph\" (UniqueName: \"kubernetes.io/projected/28075288-21df-4796-ae55-feb4d7f91163-kube-api-access-sx2ph\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.127715 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.127727 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28075288-21df-4796-ae55-feb4d7f91163-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.563982 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9plpw" event={"ID":"28075288-21df-4796-ae55-feb4d7f91163","Type":"ContainerDied","Data":"95f91e747f4d8b1450545ee7fe9c8a7e64a6b86477420701a9f98f4fcb9e3099"} Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.564035 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95f91e747f4d8b1450545ee7fe9c8a7e64a6b86477420701a9f98f4fcb9e3099" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.564107 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9plpw" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.847906 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:28 crc kubenswrapper[4929]: E1002 12:42:28.848683 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28075288-21df-4796-ae55-feb4d7f91163" containerName="glance-db-sync" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.848708 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="28075288-21df-4796-ae55-feb4d7f91163" containerName="glance-db-sync" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.848915 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="28075288-21df-4796-ae55-feb4d7f91163" containerName="glance-db-sync" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.850031 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.854028 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.854280 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ltb8t" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.854566 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.854659 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.879836 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.942213 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bcd9f45f-ckxz8"] Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.943845 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.943936 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.943998 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpn4j\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-kube-api-access-xpn4j\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.944025 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-logs\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.944182 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.944273 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-ceph\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.944344 
4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.946572 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:28 crc kubenswrapper[4929]: I1002 12:42:28.958636 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bcd9f45f-ckxz8"] Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.041337 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.042926 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045243 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045552 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045602 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-ceph\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045629 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045675 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045699 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-config\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045726 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87bf\" (UniqueName: \"kubernetes.io/projected/99ebc735-e9a7-4713-acfb-5fcf1c091db7-kube-api-access-d87bf\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 
12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045751 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045768 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-dns-svc\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045788 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-sb\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045823 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpn4j\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-kube-api-access-xpn4j\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045841 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-logs\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.045860 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-nb\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.054981 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-ceph\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.057504 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-logs\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.057504 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.059025 4929 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.059901 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.066718 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.069451 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.092472 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpn4j\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-kube-api-access-xpn4j\") pod \"glance-default-external-api-0\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147194 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147270 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147457 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-config\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147549 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d87bf\" (UniqueName: \"kubernetes.io/projected/99ebc735-e9a7-4713-acfb-5fcf1c091db7-kube-api-access-d87bf\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" 
Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147571 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9k9\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-kube-api-access-2f9k9\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147623 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-dns-svc\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147660 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-sb\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147735 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147779 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147812 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-nb\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.147838 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.149132 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-nb\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.149356 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-sb\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 
12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.149537 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-dns-svc\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.149648 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-config\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.166320 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87bf\" (UniqueName: \"kubernetes.io/projected/99ebc735-e9a7-4713-acfb-5fcf1c091db7-kube-api-access-d87bf\") pod \"dnsmasq-dns-84bcd9f45f-ckxz8\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.181901 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.251221 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.251580 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.251752 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.251844 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.252003 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.252080 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9k9\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-kube-api-access-2f9k9\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") 
" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.252177 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.254883 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-logs\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.255204 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.257189 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.258655 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.263174 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.267465 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.272565 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.275045 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9k9\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-kube-api-access-2f9k9\") pod \"glance-default-internal-api-0\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.464074 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.841914 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bcd9f45f-ckxz8"] Oct 02 12:42:29 crc kubenswrapper[4929]: I1002 12:42:29.922661 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.001643 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.658763 4929 generic.go:334] "Generic (PLEG): container finished" podID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerID="c9579f1c0623d2330c097ab92fe29af08eb511c5a730495022957d926f40b439" exitCode=0 Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.659439 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" event={"ID":"99ebc735-e9a7-4713-acfb-5fcf1c091db7","Type":"ContainerDied","Data":"c9579f1c0623d2330c097ab92fe29af08eb511c5a730495022957d926f40b439"} Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.659476 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" event={"ID":"99ebc735-e9a7-4713-acfb-5fcf1c091db7","Type":"ContainerStarted","Data":"4961dc2476f049ef6cc591fcee1e0305fd47de86c477c9cffde8ef3d8b2e8f37"} Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.676680 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9","Type":"ContainerStarted","Data":"51b61c84979f5db6b55a04af0fd1a3a14053de7a3c32ce291e3c08a7322c2cde"} Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.696899 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6d266ef-eafc-4c02-8706-9d8317f36f48","Type":"ContainerStarted","Data":"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3"} Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.697005 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6d266ef-eafc-4c02-8706-9d8317f36f48","Type":"ContainerStarted","Data":"436a4ba1bcc946585d42435abb24f79c6ff4eadaddc98b98285e4b4c0371425e"} Oct 02 12:42:30 crc kubenswrapper[4929]: I1002 12:42:30.739492 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.706348 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" event={"ID":"99ebc735-e9a7-4713-acfb-5fcf1c091db7","Type":"ContainerStarted","Data":"0eaed74e6a15055d5d32302c5269f13970bc413e2bd1003e855b508594c7ad07"} Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.706662 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.709170 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9","Type":"ContainerStarted","Data":"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502"} Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.709211 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9","Type":"ContainerStarted","Data":"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586"} Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.711384 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6d266ef-eafc-4c02-8706-9d8317f36f48","Type":"ContainerStarted","Data":"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861"} Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.711501 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-log" containerID="cri-o://c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3" gracePeriod=30 Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.711509 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-httpd" containerID="cri-o://c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861" gracePeriod=30 Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.726069 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" podStartSLOduration=3.726048924 podStartE2EDuration="3.726048924s" podCreationTimestamp="2025-10-02 12:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:31.723378647 +0000 UTC m=+5552.273745021" watchObservedRunningTime="2025-10-02 12:42:31.726048924 +0000 UTC m=+5552.276415278" Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.756139 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.756119301 podStartE2EDuration="2.756119301s" podCreationTimestamp="2025-10-02 12:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:31.744545507 +0000 UTC m=+5552.294911871" watchObservedRunningTime="2025-10-02 12:42:31.756119301 +0000 UTC m=+5552.306485675" Oct 02 12:42:31 crc kubenswrapper[4929]: I1002 12:42:31.768188 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.7681603580000003 podStartE2EDuration="3.768160358s" podCreationTimestamp="2025-10-02 12:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:31.763458243 +0000 UTC m=+5552.313824627" watchObservedRunningTime="2025-10-02 12:42:31.768160358 +0000 UTC m=+5552.318526722" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.407875 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.535340 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpn4j\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-kube-api-access-xpn4j\") pod \"e6d266ef-eafc-4c02-8706-9d8317f36f48\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.535738 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-ceph\") pod \"e6d266ef-eafc-4c02-8706-9d8317f36f48\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.536038 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-scripts\") pod \"e6d266ef-eafc-4c02-8706-9d8317f36f48\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.536081 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-config-data\") pod \"e6d266ef-eafc-4c02-8706-9d8317f36f48\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.536112 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-httpd-run\") pod \"e6d266ef-eafc-4c02-8706-9d8317f36f48\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.536132 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-logs\") pod \"e6d266ef-eafc-4c02-8706-9d8317f36f48\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.536170 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-combined-ca-bundle\") pod \"e6d266ef-eafc-4c02-8706-9d8317f36f48\" (UID: \"e6d266ef-eafc-4c02-8706-9d8317f36f48\") " Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.536541 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-logs" (OuterVolumeSpecName: "logs") pod "e6d266ef-eafc-4c02-8706-9d8317f36f48" (UID: "e6d266ef-eafc-4c02-8706-9d8317f36f48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.536628 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6d266ef-eafc-4c02-8706-9d8317f36f48" (UID: "e6d266ef-eafc-4c02-8706-9d8317f36f48"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.542639 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-kube-api-access-xpn4j" (OuterVolumeSpecName: "kube-api-access-xpn4j") pod "e6d266ef-eafc-4c02-8706-9d8317f36f48" (UID: "e6d266ef-eafc-4c02-8706-9d8317f36f48"). InnerVolumeSpecName "kube-api-access-xpn4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.543818 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-scripts" (OuterVolumeSpecName: "scripts") pod "e6d266ef-eafc-4c02-8706-9d8317f36f48" (UID: "e6d266ef-eafc-4c02-8706-9d8317f36f48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.550245 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-ceph" (OuterVolumeSpecName: "ceph") pod "e6d266ef-eafc-4c02-8706-9d8317f36f48" (UID: "e6d266ef-eafc-4c02-8706-9d8317f36f48"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.563634 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6d266ef-eafc-4c02-8706-9d8317f36f48" (UID: "e6d266ef-eafc-4c02-8706-9d8317f36f48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.589425 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-config-data" (OuterVolumeSpecName: "config-data") pod "e6d266ef-eafc-4c02-8706-9d8317f36f48" (UID: "e6d266ef-eafc-4c02-8706-9d8317f36f48"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.638218 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.638276 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.638291 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.638304 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d266ef-eafc-4c02-8706-9d8317f36f48-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.638314 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d266ef-eafc-4c02-8706-9d8317f36f48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.638329 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpn4j\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-kube-api-access-xpn4j\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.638340 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e6d266ef-eafc-4c02-8706-9d8317f36f48-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.721252 4929 generic.go:334] "Generic (PLEG): container finished" podID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerID="c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861" exitCode=0 Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.721283 4929 generic.go:334] "Generic (PLEG): container finished" podID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerID="c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3" exitCode=143 Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.722106 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6d266ef-eafc-4c02-8706-9d8317f36f48","Type":"ContainerDied","Data":"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861"} Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.722168 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6d266ef-eafc-4c02-8706-9d8317f36f48","Type":"ContainerDied","Data":"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3"} Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.722182 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6d266ef-eafc-4c02-8706-9d8317f36f48","Type":"ContainerDied","Data":"436a4ba1bcc946585d42435abb24f79c6ff4eadaddc98b98285e4b4c0371425e"} Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.722203 4929 scope.go:117] "RemoveContainer" containerID="c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 
12:42:32.722425 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.765921 4929 scope.go:117] "RemoveContainer" containerID="c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.776071 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.786463 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.794270 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:32 crc kubenswrapper[4929]: E1002 12:42:32.795278 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-httpd" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.795299 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-httpd" Oct 02 12:42:32 crc kubenswrapper[4929]: E1002 12:42:32.795328 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-log" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.795336 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-log" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.795598 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-httpd" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.795627 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" containerName="glance-log" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.798213 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.801340 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.809364 4929 scope.go:117] "RemoveContainer" containerID="c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861" Oct 02 12:42:32 crc kubenswrapper[4929]: E1002 12:42:32.810255 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861\": container with ID starting with c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861 not found: ID does not exist" containerID="c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.810289 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861"} err="failed to get container status \"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861\": rpc error: code = NotFound desc = could not find container \"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861\": container with ID starting with c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861 not found: ID does not exist" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.810326 4929 scope.go:117] "RemoveContainer" containerID="c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3" Oct 02 12:42:32 crc kubenswrapper[4929]: E1002 12:42:32.810874 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3\": container with ID starting with c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3 not found: ID does not exist" containerID="c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.810903 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3"} err="failed to get container status \"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3\": rpc error: code = NotFound desc = could not find container \"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3\": container with ID starting with c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3 not found: ID does not exist" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.810921 4929 scope.go:117] "RemoveContainer" containerID="c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.814331 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861"} err="failed to get container status \"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861\": rpc error: code = NotFound desc = could not find container \"c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861\": container with ID starting with c04a95e61af63ec643ae8b706c057d07060e980574c4d6aed87e8edf4a6df861 not found: ID does not exist" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 
12:42:32.814401 4929 scope.go:117] "RemoveContainer" containerID="c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.818687 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3"} err="failed to get container status \"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3\": rpc error: code = NotFound desc = could not find container \"c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3\": container with ID starting with c57518e2bbacf3b76813ba7977aa5e1fcaf0888c0648097a0f5b2d7f6b2309e3 not found: ID does not exist" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.843793 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.944621 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.944897 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-scripts\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.945061 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-config-data\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.945153 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.945241 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-ceph\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.945388 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpvc\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-kube-api-access-jjpvc\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:32 crc kubenswrapper[4929]: I1002 12:42:32.945472 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-logs\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.046810 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-config-data\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.047906 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.048327 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-ceph\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.048512 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpvc\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-kube-api-access-jjpvc\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.048588 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-logs\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.048732 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.048849 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-scripts\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.050017 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.050337 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.051993 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-ceph\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.052010 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.053049 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-config-data\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.054517 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-scripts\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.065992 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpvc\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-kube-api-access-jjpvc\") pod \"glance-default-external-api-0\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.120889 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.213991 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.496620 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:42:33 crc kubenswrapper[4929]: W1002 12:42:33.501090 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74994619_1056_48dc_aece_0539c1a9ec0f.slice/crio-d42e1c7448df57fa43aaeec1aca65b57f785f65ebc368254de8c201007b9b906 WatchSource:0}: Error finding container d42e1c7448df57fa43aaeec1aca65b57f785f65ebc368254de8c201007b9b906: Status 404 returned error can't find the container with id d42e1c7448df57fa43aaeec1aca65b57f785f65ebc368254de8c201007b9b906 Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.738314 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74994619-1056-48dc-aece-0539c1a9ec0f","Type":"ContainerStarted","Data":"d42e1c7448df57fa43aaeec1aca65b57f785f65ebc368254de8c201007b9b906"} Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.741450 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-log" containerID="cri-o://42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586" gracePeriod=30 Oct 02 12:42:33 crc kubenswrapper[4929]: I1002 12:42:33.742051 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-httpd" containerID="cri-o://c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502" gracePeriod=30 Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.170260 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d266ef-eafc-4c02-8706-9d8317f36f48" path="/var/lib/kubelet/pods/e6d266ef-eafc-4c02-8706-9d8317f36f48/volumes" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.383227 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.475777 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-combined-ca-bundle\") pod \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.475837 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-ceph\") pod \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.475874 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-logs\") pod \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.475977 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-config-data\") pod \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.476075 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-httpd-run\") pod \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.476113 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-scripts\") pod \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.476134 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f9k9\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-kube-api-access-2f9k9\") pod \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\" (UID: \"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9\") " Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.476446 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-logs" (OuterVolumeSpecName: "logs") pod "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" (UID: "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.477152 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" (UID: "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.480991 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-scripts" (OuterVolumeSpecName: "scripts") pod "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" (UID: "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.481088 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-kube-api-access-2f9k9" (OuterVolumeSpecName: "kube-api-access-2f9k9") pod "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" (UID: "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9"). InnerVolumeSpecName "kube-api-access-2f9k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.489292 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-ceph" (OuterVolumeSpecName: "ceph") pod "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" (UID: "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.501340 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" (UID: "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.531751 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-config-data" (OuterVolumeSpecName: "config-data") pod "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" (UID: "34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.577880 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.577916 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.577925 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.577934 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f9k9\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-kube-api-access-2f9k9\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.577945 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.577978 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.577991 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.755163 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74994619-1056-48dc-aece-0539c1a9ec0f","Type":"ContainerStarted","Data":"300b7b8748f8e764df3c9750de14ac3a0466602dd35dd8d5ff84ec3876f4ba79"} Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.759399 4929 generic.go:334] "Generic (PLEG): container finished" podID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerID="c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502" exitCode=0 Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.759468 4929 generic.go:334] "Generic (PLEG): container finished" podID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerID="42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586" exitCode=143 Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.759500 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9","Type":"ContainerDied","Data":"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502"} Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.759558 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9","Type":"ContainerDied","Data":"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586"} Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.759579 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9","Type":"ContainerDied","Data":"51b61c84979f5db6b55a04af0fd1a3a14053de7a3c32ce291e3c08a7322c2cde"} Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.759600 4929 scope.go:117] "RemoveContainer" containerID="c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.759979 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.803268 4929 scope.go:117] "RemoveContainer" containerID="42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.808782 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.823290 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.837112 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:34 crc kubenswrapper[4929]: E1002 12:42:34.837571 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-httpd" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.837583 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-httpd" Oct 02 12:42:34 crc kubenswrapper[4929]: E1002 12:42:34.837620 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-log" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.837627 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-log" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.837815 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-log" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.837831 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" containerName="glance-httpd" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.838881 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.842745 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.849827 4929 scope.go:117] "RemoveContainer" containerID="c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502" Oct 02 12:42:34 crc kubenswrapper[4929]: E1002 12:42:34.853590 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502\": container with ID starting with c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502 not found: ID does not exist" containerID="c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.853636 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502"} err="failed to get container status \"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502\": rpc error: code = NotFound desc = could not find container \"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502\": container with ID starting with c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502 not found: ID does not exist" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.853664 4929 scope.go:117] "RemoveContainer" containerID="42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.855006 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:34 crc kubenswrapper[4929]: E1002 12:42:34.875051 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586\": container with ID starting with 42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586 not found: ID does not exist" containerID="42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.875095 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586"} err="failed to get container status \"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586\": rpc error: code = NotFound desc = could not find container \"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586\": container with ID starting with 42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586 not found: ID does not exist" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.875123 4929 scope.go:117] "RemoveContainer" containerID="c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.876970 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502"} err="failed to get container status \"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502\": rpc error: code = NotFound desc = could not find container \"c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502\": container with ID 
starting with c23f1c831f711f5f17371e41eaf69ee4aadcda31fff36b6086a8667e120c8502 not found: ID does not exist" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.876999 4929 scope.go:117] "RemoveContainer" containerID="42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.877330 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586"} err="failed to get container status \"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586\": rpc error: code = NotFound desc = could not find container \"42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586\": container with ID starting with 42f0d2f9e83681e7d57e95c396e266091ab8720c4204b4efc37f8b706462d586 not found: ID does not exist" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.985503 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.985578 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.985655 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.985744 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-logs\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.985776 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.985830 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nckxd\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-kube-api-access-nckxd\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:34 crc kubenswrapper[4929]: I1002 12:42:34.985919 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.088079 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.088138 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nckxd\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-kube-api-access-nckxd\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.088181 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.088268 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.088310 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.088343 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.088401 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-logs\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.089028 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-logs\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.089036 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.093933 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.095062 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.101517 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.105741 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.121429 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nckxd\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-kube-api-access-nckxd\") pod \"glance-default-internal-api-0\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.157117 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:42:35 crc kubenswrapper[4929]: E1002 12:42:35.157500 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.163417 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.714209 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.768786 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74994619-1056-48dc-aece-0539c1a9ec0f","Type":"ContainerStarted","Data":"f45a3dab6740cca432cca97ef98624da4d4a1d3ebd1ecbb2496f7941a21ba577"} Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.769778 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"badb9956-b41f-474b-b15d-f65c8486611a","Type":"ContainerStarted","Data":"dbea50f9e17f6ef323077d0a35c60452f94ec335255720cec82c328e5b5ab0da"} Oct 02 12:42:35 crc kubenswrapper[4929]: I1002 12:42:35.786607 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.786583555 podStartE2EDuration="3.786583555s" podCreationTimestamp="2025-10-02 12:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:35.784238887 +0000 UTC m=+5556.334605251" watchObservedRunningTime="2025-10-02 12:42:35.786583555 +0000 UTC m=+5556.336949929" Oct 02 12:42:36 crc kubenswrapper[4929]: I1002 12:42:36.170194 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9" path="/var/lib/kubelet/pods/34bfcb3e-bdf9-4ab3-b4e2-c341bc562bc9/volumes" Oct 02 12:42:36 crc kubenswrapper[4929]: I1002 12:42:36.782248 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"badb9956-b41f-474b-b15d-f65c8486611a","Type":"ContainerStarted","Data":"d094528a48d858baf682d85fad791241c6657f72c3cfe62385d176e9b62435d5"} Oct 02 12:42:37 crc kubenswrapper[4929]: I1002 12:42:37.793810 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"badb9956-b41f-474b-b15d-f65c8486611a","Type":"ContainerStarted","Data":"5b3ed6ca35f0875fefeb2e2c714434a74c61fea5d9c263e57f3a282989f4c939"} Oct 02 12:42:37 crc kubenswrapper[4929]: I1002 12:42:37.820539 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.8205178650000002 podStartE2EDuration="3.820517865s" podCreationTimestamp="2025-10-02 12:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:37.815723677 +0000 UTC m=+5558.366090061" watchObservedRunningTime="2025-10-02 12:42:37.820517865 +0000 UTC m=+5558.370884229" Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.274149 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.334537 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b84f67b9c-svnjr"] Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.334778 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" podUID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerName="dnsmasq-dns" 
containerID="cri-o://2e7881f32499e05fb1678074b477846d647205c96e4702c4d86791d0e2a847ff" gracePeriod=10 Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.820907 4929 generic.go:334] "Generic (PLEG): container finished" podID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerID="2e7881f32499e05fb1678074b477846d647205c96e4702c4d86791d0e2a847ff" exitCode=0 Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.820999 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" event={"ID":"2b4b3e26-83a9-4640-866e-17e037bdbbf9","Type":"ContainerDied","Data":"2e7881f32499e05fb1678074b477846d647205c96e4702c4d86791d0e2a847ff"} Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.821068 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" event={"ID":"2b4b3e26-83a9-4640-866e-17e037bdbbf9","Type":"ContainerDied","Data":"848502dbff8e1d8bdc88495ef5d5d1a691510236ebceb637baab705a031bc7f4"} Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.821081 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="848502dbff8e1d8bdc88495ef5d5d1a691510236ebceb637baab705a031bc7f4" Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.855335 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.990517 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-dns-svc\") pod \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.990585 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-sb\") pod \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.990684 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-config\") pod \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.990750 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-nb\") pod \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " Oct 02 12:42:39 crc kubenswrapper[4929]: I1002 12:42:39.990835 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8zfb\" (UniqueName: \"kubernetes.io/projected/2b4b3e26-83a9-4640-866e-17e037bdbbf9-kube-api-access-h8zfb\") pod \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\" (UID: \"2b4b3e26-83a9-4640-866e-17e037bdbbf9\") " Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.008339 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4b3e26-83a9-4640-866e-17e037bdbbf9-kube-api-access-h8zfb" (OuterVolumeSpecName: "kube-api-access-h8zfb") pod "2b4b3e26-83a9-4640-866e-17e037bdbbf9" (UID: "2b4b3e26-83a9-4640-866e-17e037bdbbf9"). InnerVolumeSpecName "kube-api-access-h8zfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.055688 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-config" (OuterVolumeSpecName: "config") pod "2b4b3e26-83a9-4640-866e-17e037bdbbf9" (UID: "2b4b3e26-83a9-4640-866e-17e037bdbbf9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.085518 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b4b3e26-83a9-4640-866e-17e037bdbbf9" (UID: "2b4b3e26-83a9-4640-866e-17e037bdbbf9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.096222 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.096474 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.096560 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8zfb\" (UniqueName: \"kubernetes.io/projected/2b4b3e26-83a9-4640-866e-17e037bdbbf9-kube-api-access-h8zfb\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.103518 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b4b3e26-83a9-4640-866e-17e037bdbbf9" (UID: "2b4b3e26-83a9-4640-866e-17e037bdbbf9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.125407 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b4b3e26-83a9-4640-866e-17e037bdbbf9" (UID: "2b4b3e26-83a9-4640-866e-17e037bdbbf9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.198172 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.198213 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b4b3e26-83a9-4640-866e-17e037bdbbf9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.834795 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b84f67b9c-svnjr" Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.861260 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b84f67b9c-svnjr"] Oct 02 12:42:40 crc kubenswrapper[4929]: I1002 12:42:40.868563 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b84f67b9c-svnjr"] Oct 02 12:42:42 crc kubenswrapper[4929]: I1002 12:42:42.167059 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" path="/var/lib/kubelet/pods/2b4b3e26-83a9-4640-866e-17e037bdbbf9/volumes" Oct 02 12:42:43 crc kubenswrapper[4929]: I1002 12:42:43.121075 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:42:43 crc kubenswrapper[4929]: I1002 12:42:43.123039 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:42:43 crc kubenswrapper[4929]: I1002 12:42:43.148251 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:42:43 crc kubenswrapper[4929]: I1002 12:42:43.158575 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:42:43 crc kubenswrapper[4929]: I1002 12:42:43.863392 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:42:43 crc kubenswrapper[4929]: I1002 12:42:43.863455 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.164620 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.164675 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.196400 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.207729 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.798144 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.811694 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.880598 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:45 crc kubenswrapper[4929]: I1002 12:42:45.880660 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:47 crc kubenswrapper[4929]: I1002 12:42:47.799293 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 12:42:47 crc kubenswrapper[4929]: I1002 12:42:47.816171 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Oct 02 12:42:50 crc kubenswrapper[4929]: I1002 12:42:50.161827 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf" Oct 02 12:42:50 crc kubenswrapper[4929]: I1002 12:42:50.927825 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"fd3f3300044292572692a4205fb0d2be0b602520d522a4e0786217e15a1c757a"} Oct 02 12:42:55 crc kubenswrapper[4929]: I1002 12:42:55.988457 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-p528n"] Oct 02 12:42:55 crc kubenswrapper[4929]: E1002 12:42:55.989361 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerName="dnsmasq-dns" Oct 02 12:42:55 crc kubenswrapper[4929]: I1002 12:42:55.989374 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerName="dnsmasq-dns" Oct 02 12:42:55 crc kubenswrapper[4929]: E1002 12:42:55.989393 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerName="init" Oct 02 12:42:55 crc kubenswrapper[4929]: I1002 12:42:55.989398 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerName="init" Oct 02 12:42:55 crc kubenswrapper[4929]: I1002 12:42:55.989570 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4b3e26-83a9-4640-866e-17e037bdbbf9" containerName="dnsmasq-dns" Oct 02 12:42:55 crc kubenswrapper[4929]: I1002 12:42:55.990180 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p528n" Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.004762 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p528n"] Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.090124 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dfml\" (UniqueName: \"kubernetes.io/projected/8ca64815-0edf-4d20-aa7f-386f89c5f1e2-kube-api-access-6dfml\") pod \"placement-db-create-p528n\" (UID: \"8ca64815-0edf-4d20-aa7f-386f89c5f1e2\") " pod="openstack/placement-db-create-p528n" Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.191307 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfml\" (UniqueName: \"kubernetes.io/projected/8ca64815-0edf-4d20-aa7f-386f89c5f1e2-kube-api-access-6dfml\") pod \"placement-db-create-p528n\" (UID: \"8ca64815-0edf-4d20-aa7f-386f89c5f1e2\") " pod="openstack/placement-db-create-p528n" Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.218988 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfml\" (UniqueName: \"kubernetes.io/projected/8ca64815-0edf-4d20-aa7f-386f89c5f1e2-kube-api-access-6dfml\") pod \"placement-db-create-p528n\" (UID: \"8ca64815-0edf-4d20-aa7f-386f89c5f1e2\") " pod="openstack/placement-db-create-p528n" Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.310249 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p528n" Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.762371 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p528n"] Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.975357 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p528n" event={"ID":"8ca64815-0edf-4d20-aa7f-386f89c5f1e2","Type":"ContainerStarted","Data":"480967e8cb034ad36623d07ca3bb12ec979c51f4eeea40530cb1f61007bf6da3"} Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.975849 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p528n" event={"ID":"8ca64815-0edf-4d20-aa7f-386f89c5f1e2","Type":"ContainerStarted","Data":"4f6fb9a0fd8c140e0410f2387dcb5a3d0079899a3e7465208ecc81c204e8e7e1"} Oct 02 12:42:56 crc kubenswrapper[4929]: I1002 12:42:56.990577 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-p528n" podStartSLOduration=1.9905582929999999 podStartE2EDuration="1.990558293s" podCreationTimestamp="2025-10-02 12:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:56.988422081 +0000 UTC m=+5577.538788445" watchObservedRunningTime="2025-10-02 12:42:56.990558293 +0000 UTC m=+5577.540924657" Oct 02 12:42:57 crc kubenswrapper[4929]: I1002 12:42:57.985607 4929 generic.go:334] "Generic (PLEG): container finished" podID="8ca64815-0edf-4d20-aa7f-386f89c5f1e2" containerID="480967e8cb034ad36623d07ca3bb12ec979c51f4eeea40530cb1f61007bf6da3" exitCode=0 Oct 02 12:42:57 crc kubenswrapper[4929]: I1002 12:42:57.985658 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p528n" event={"ID":"8ca64815-0edf-4d20-aa7f-386f89c5f1e2","Type":"ContainerDied","Data":"480967e8cb034ad36623d07ca3bb12ec979c51f4eeea40530cb1f61007bf6da3"} Oct 02 12:42:59 crc kubenswrapper[4929]: I1002 12:42:59.318364 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p528n" Oct 02 12:42:59 crc kubenswrapper[4929]: I1002 12:42:59.448708 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dfml\" (UniqueName: \"kubernetes.io/projected/8ca64815-0edf-4d20-aa7f-386f89c5f1e2-kube-api-access-6dfml\") pod \"8ca64815-0edf-4d20-aa7f-386f89c5f1e2\" (UID: \"8ca64815-0edf-4d20-aa7f-386f89c5f1e2\") " Oct 02 12:42:59 crc kubenswrapper[4929]: I1002 12:42:59.455869 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca64815-0edf-4d20-aa7f-386f89c5f1e2-kube-api-access-6dfml" (OuterVolumeSpecName: "kube-api-access-6dfml") pod "8ca64815-0edf-4d20-aa7f-386f89c5f1e2" (UID: "8ca64815-0edf-4d20-aa7f-386f89c5f1e2"). InnerVolumeSpecName "kube-api-access-6dfml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:59 crc kubenswrapper[4929]: I1002 12:42:59.551757 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dfml\" (UniqueName: \"kubernetes.io/projected/8ca64815-0edf-4d20-aa7f-386f89c5f1e2-kube-api-access-6dfml\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:00 crc kubenswrapper[4929]: I1002 12:43:00.005665 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p528n" event={"ID":"8ca64815-0edf-4d20-aa7f-386f89c5f1e2","Type":"ContainerDied","Data":"4f6fb9a0fd8c140e0410f2387dcb5a3d0079899a3e7465208ecc81c204e8e7e1"} Oct 02 12:43:00 crc kubenswrapper[4929]: I1002 12:43:00.005705 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f6fb9a0fd8c140e0410f2387dcb5a3d0079899a3e7465208ecc81c204e8e7e1" Oct 02 12:43:00 crc kubenswrapper[4929]: I1002 12:43:00.006432 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p528n" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.074917 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76f2-account-create-x49sr"] Oct 02 12:43:06 crc kubenswrapper[4929]: E1002 12:43:06.075896 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca64815-0edf-4d20-aa7f-386f89c5f1e2" containerName="mariadb-database-create" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.075912 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca64815-0edf-4d20-aa7f-386f89c5f1e2" containerName="mariadb-database-create" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.076149 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca64815-0edf-4d20-aa7f-386f89c5f1e2" containerName="mariadb-database-create" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.076875 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76f2-account-create-x49sr" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.078438 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.083504 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76f2-account-create-x49sr"] Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.165496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfx5n\" (UniqueName: \"kubernetes.io/projected/79e04b99-9193-46d9-9911-2ea9d88d0cc1-kube-api-access-rfx5n\") pod \"placement-76f2-account-create-x49sr\" (UID: \"79e04b99-9193-46d9-9911-2ea9d88d0cc1\") " pod="openstack/placement-76f2-account-create-x49sr" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.266704 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfx5n\" (UniqueName: \"kubernetes.io/projected/79e04b99-9193-46d9-9911-2ea9d88d0cc1-kube-api-access-rfx5n\") pod \"placement-76f2-account-create-x49sr\" (UID: \"79e04b99-9193-46d9-9911-2ea9d88d0cc1\") " pod="openstack/placement-76f2-account-create-x49sr" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.295346 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfx5n\" (UniqueName: \"kubernetes.io/projected/79e04b99-9193-46d9-9911-2ea9d88d0cc1-kube-api-access-rfx5n\") pod \"placement-76f2-account-create-x49sr\" (UID: \"79e04b99-9193-46d9-9911-2ea9d88d0cc1\") " pod="openstack/placement-76f2-account-create-x49sr" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.399087 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76f2-account-create-x49sr" Oct 02 12:43:06 crc kubenswrapper[4929]: I1002 12:43:06.817758 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76f2-account-create-x49sr"] Oct 02 12:43:07 crc kubenswrapper[4929]: I1002 12:43:07.073866 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f2-account-create-x49sr" event={"ID":"79e04b99-9193-46d9-9911-2ea9d88d0cc1","Type":"ContainerStarted","Data":"06669eb5e32fa6abbc08f9693688c2ff9ac4da00ee276f8e87f8e0916ea73e51"} Oct 02 12:43:07 crc kubenswrapper[4929]: I1002 12:43:07.074579 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f2-account-create-x49sr" event={"ID":"79e04b99-9193-46d9-9911-2ea9d88d0cc1","Type":"ContainerStarted","Data":"76c13b222bc7b82fd893e891d3b97bf23137412c589f4c61911366c2947dd3fa"} Oct 02 12:43:07 crc kubenswrapper[4929]: I1002 12:43:07.099243 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76f2-account-create-x49sr" podStartSLOduration=1.099219499 podStartE2EDuration="1.099219499s" podCreationTimestamp="2025-10-02 12:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:43:07.094087601 +0000 UTC m=+5587.644453985" watchObservedRunningTime="2025-10-02 12:43:07.099219499 +0000 UTC m=+5587.649585863" Oct 02 12:43:08 crc kubenswrapper[4929]: I1002 12:43:08.084745 4929 generic.go:334] "Generic (PLEG): container finished" podID="79e04b99-9193-46d9-9911-2ea9d88d0cc1" containerID="06669eb5e32fa6abbc08f9693688c2ff9ac4da00ee276f8e87f8e0916ea73e51" exitCode=0 Oct 02 12:43:08 crc kubenswrapper[4929]: I1002 12:43:08.084870 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f2-account-create-x49sr" event={"ID":"79e04b99-9193-46d9-9911-2ea9d88d0cc1","Type":"ContainerDied","Data":"06669eb5e32fa6abbc08f9693688c2ff9ac4da00ee276f8e87f8e0916ea73e51"} Oct 02 12:43:09 crc kubenswrapper[4929]: I1002 12:43:09.444015 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f2-account-create-x49sr" Oct 02 12:43:09 crc kubenswrapper[4929]: I1002 12:43:09.527456 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfx5n\" (UniqueName: \"kubernetes.io/projected/79e04b99-9193-46d9-9911-2ea9d88d0cc1-kube-api-access-rfx5n\") pod \"79e04b99-9193-46d9-9911-2ea9d88d0cc1\" (UID: \"79e04b99-9193-46d9-9911-2ea9d88d0cc1\") " Oct 02 12:43:09 crc kubenswrapper[4929]: I1002 12:43:09.533453 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e04b99-9193-46d9-9911-2ea9d88d0cc1-kube-api-access-rfx5n" (OuterVolumeSpecName: "kube-api-access-rfx5n") pod "79e04b99-9193-46d9-9911-2ea9d88d0cc1" (UID: "79e04b99-9193-46d9-9911-2ea9d88d0cc1"). InnerVolumeSpecName "kube-api-access-rfx5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:43:09 crc kubenswrapper[4929]: I1002 12:43:09.630885 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfx5n\" (UniqueName: \"kubernetes.io/projected/79e04b99-9193-46d9-9911-2ea9d88d0cc1-kube-api-access-rfx5n\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:10 crc kubenswrapper[4929]: I1002 12:43:10.106280 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76f2-account-create-x49sr" event={"ID":"79e04b99-9193-46d9-9911-2ea9d88d0cc1","Type":"ContainerDied","Data":"76c13b222bc7b82fd893e891d3b97bf23137412c589f4c61911366c2947dd3fa"} Oct 02 12:43:10 crc kubenswrapper[4929]: I1002 12:43:10.106333 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76f2-account-create-x49sr" Oct 02 12:43:10 crc kubenswrapper[4929]: I1002 12:43:10.106351 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c13b222bc7b82fd893e891d3b97bf23137412c589f4c61911366c2947dd3fa" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.284365 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56b4f4cfbf-b59xq"] Oct 02 12:43:11 crc kubenswrapper[4929]: E1002 12:43:11.285127 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e04b99-9193-46d9-9911-2ea9d88d0cc1" containerName="mariadb-account-create" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.285143 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e04b99-9193-46d9-9911-2ea9d88d0cc1" containerName="mariadb-account-create" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.285355 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e04b99-9193-46d9-9911-2ea9d88d0cc1" containerName="mariadb-account-create" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.286558 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.310897 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56b4f4cfbf-b59xq"] Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.349576 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nfwkm"] Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.351747 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.353605 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f7hqc" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.354095 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.358920 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.369235 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nfwkm"] Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.369311 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-sb\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.369413 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d986\" (UniqueName: \"kubernetes.io/projected/79f52949-db23-4922-adf0-eb1122fe74a5-kube-api-access-5d986\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.369441 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-config\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.369474 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-nb\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.369528 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-dns-svc\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.487950 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d986\" (UniqueName: \"kubernetes.io/projected/79f52949-db23-4922-adf0-eb1122fe74a5-kube-api-access-5d986\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.488367 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-config\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc 
kubenswrapper[4929]: I1002 12:43:11.488418 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-nb\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.488491 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-dns-svc\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.488581 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-logs\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.488644 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-scripts\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.488686 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-config-data\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.488777 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pq9b\" (UniqueName: \"kubernetes.io/projected/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-kube-api-access-2pq9b\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.488952 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-sb\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.489119 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-combined-ca-bundle\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.490010 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-config\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.497500 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-nb\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.500180 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-sb\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.501261 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-dns-svc\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.512207 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d986\" (UniqueName: \"kubernetes.io/projected/79f52949-db23-4922-adf0-eb1122fe74a5-kube-api-access-5d986\") pod \"dnsmasq-dns-56b4f4cfbf-b59xq\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.590957 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-combined-ca-bundle\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.591789 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-logs\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.592158 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-scripts\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.592266 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-config-data\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.592389 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pq9b\" (UniqueName: \"kubernetes.io/projected/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-kube-api-access-2pq9b\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.592487 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-logs\") 
pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.595035 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-scripts\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.595936 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-config-data\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.599025 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-combined-ca-bundle\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.611887 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pq9b\" (UniqueName: \"kubernetes.io/projected/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-kube-api-access-2pq9b\") pod \"placement-db-sync-nfwkm\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.612542 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:11 crc kubenswrapper[4929]: I1002 12:43:11.685755 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:12 crc kubenswrapper[4929]: I1002 12:43:12.073751 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56b4f4cfbf-b59xq"] Oct 02 12:43:12 crc kubenswrapper[4929]: W1002 12:43:12.084934 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f52949_db23_4922_adf0_eb1122fe74a5.slice/crio-97ad239bf612d5d3f2803c6a25678dc27c3bf1cd062e9f76f0c01152cdb71b56 WatchSource:0}: Error finding container 97ad239bf612d5d3f2803c6a25678dc27c3bf1cd062e9f76f0c01152cdb71b56: Status 404 returned error can't find the container with id 97ad239bf612d5d3f2803c6a25678dc27c3bf1cd062e9f76f0c01152cdb71b56 Oct 02 12:43:12 crc kubenswrapper[4929]: I1002 12:43:12.128397 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" event={"ID":"79f52949-db23-4922-adf0-eb1122fe74a5","Type":"ContainerStarted","Data":"97ad239bf612d5d3f2803c6a25678dc27c3bf1cd062e9f76f0c01152cdb71b56"} Oct 02 12:43:12 crc kubenswrapper[4929]: I1002 12:43:12.199730 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nfwkm"] Oct 02 12:43:12 crc kubenswrapper[4929]: W1002 12:43:12.229202 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod084f7e37_6fec_4747_9bc6_c1e0bff98ab1.slice/crio-54f7690cd9aace779ecbec9473a284c186c9a35b6ad5b531f81cbe7925d7c877 WatchSource:0}: Error finding container 54f7690cd9aace779ecbec9473a284c186c9a35b6ad5b531f81cbe7925d7c877: Status 404 returned error can't find the container with id 54f7690cd9aace779ecbec9473a284c186c9a35b6ad5b531f81cbe7925d7c877 Oct 02 12:43:13 crc kubenswrapper[4929]: I1002 12:43:13.138417 4929 generic.go:334] "Generic (PLEG): container finished" podID="79f52949-db23-4922-adf0-eb1122fe74a5" containerID="2c9e3924094901738b5b42f2d198b3a80e24ebd1bd6c40e5d0c395166ca38d9f" exitCode=0 Oct 02 12:43:13 crc kubenswrapper[4929]: I1002 12:43:13.138922 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" event={"ID":"79f52949-db23-4922-adf0-eb1122fe74a5","Type":"ContainerDied","Data":"2c9e3924094901738b5b42f2d198b3a80e24ebd1bd6c40e5d0c395166ca38d9f"} Oct 02 12:43:13 crc kubenswrapper[4929]: I1002 12:43:13.142000 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfwkm" event={"ID":"084f7e37-6fec-4747-9bc6-c1e0bff98ab1","Type":"ContainerStarted","Data":"9f95995e2b136312306021883267e372280cb3d03bf64817eab5d03336f9b74b"} Oct 02 12:43:13 crc kubenswrapper[4929]: I1002 12:43:13.142039 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfwkm" event={"ID":"084f7e37-6fec-4747-9bc6-c1e0bff98ab1","Type":"ContainerStarted","Data":"54f7690cd9aace779ecbec9473a284c186c9a35b6ad5b531f81cbe7925d7c877"} Oct 02 12:43:13 crc kubenswrapper[4929]: I1002 12:43:13.188604 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nfwkm" podStartSLOduration=2.1885835240000002 podStartE2EDuration="2.188583524s" podCreationTimestamp="2025-10-02 12:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:43:13.185181806 +0000 UTC m=+5593.735548170" watchObservedRunningTime="2025-10-02 12:43:13.188583524 +0000 UTC m=+5593.738949888" Oct 02 
12:43:14 crc kubenswrapper[4929]: I1002 12:43:14.155459 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" event={"ID":"79f52949-db23-4922-adf0-eb1122fe74a5","Type":"ContainerStarted","Data":"b40a250f80ae363a7b44880a9f84e07de3cd45804238dc2868f5a718ec1da41f"} Oct 02 12:43:14 crc kubenswrapper[4929]: I1002 12:43:14.156931 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:14 crc kubenswrapper[4929]: I1002 12:43:14.179166 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" podStartSLOduration=3.179150422 podStartE2EDuration="3.179150422s" podCreationTimestamp="2025-10-02 12:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:43:14.175452036 +0000 UTC m=+5594.725818440" watchObservedRunningTime="2025-10-02 12:43:14.179150422 +0000 UTC m=+5594.729516786" Oct 02 12:43:16 crc kubenswrapper[4929]: I1002 12:43:16.140783 4929 scope.go:117] "RemoveContainer" containerID="13c4330041e02e2d3710ba5e08cebadcfb218ad87466918f5702c4a5ab534e97" Oct 02 12:43:16 crc kubenswrapper[4929]: I1002 12:43:16.197907 4929 scope.go:117] "RemoveContainer" containerID="70dfc92cff4800037a88feb3dd0f80a1ac32805dd1374353bb74f71d693bc5c7" Oct 02 12:43:18 crc kubenswrapper[4929]: I1002 12:43:18.201686 4929 generic.go:334] "Generic (PLEG): container finished" podID="084f7e37-6fec-4747-9bc6-c1e0bff98ab1" containerID="9f95995e2b136312306021883267e372280cb3d03bf64817eab5d03336f9b74b" exitCode=0 Oct 02 12:43:18 crc kubenswrapper[4929]: I1002 12:43:18.201752 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfwkm" event={"ID":"084f7e37-6fec-4747-9bc6-c1e0bff98ab1","Type":"ContainerDied","Data":"9f95995e2b136312306021883267e372280cb3d03bf64817eab5d03336f9b74b"} Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.565426 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.646782 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-scripts\") pod \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.646870 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-combined-ca-bundle\") pod \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.646915 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pq9b\" (UniqueName: \"kubernetes.io/projected/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-kube-api-access-2pq9b\") pod \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.646971 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-config-data\") pod \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.647047 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-logs\") pod \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\" (UID: \"084f7e37-6fec-4747-9bc6-c1e0bff98ab1\") " Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.647919 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-logs" (OuterVolumeSpecName: "logs") pod "084f7e37-6fec-4747-9bc6-c1e0bff98ab1" (UID: "084f7e37-6fec-4747-9bc6-c1e0bff98ab1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.677124 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-scripts" (OuterVolumeSpecName: "scripts") pod "084f7e37-6fec-4747-9bc6-c1e0bff98ab1" (UID: "084f7e37-6fec-4747-9bc6-c1e0bff98ab1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.694407 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-kube-api-access-2pq9b" (OuterVolumeSpecName: "kube-api-access-2pq9b") pod "084f7e37-6fec-4747-9bc6-c1e0bff98ab1" (UID: "084f7e37-6fec-4747-9bc6-c1e0bff98ab1"). InnerVolumeSpecName "kube-api-access-2pq9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.702283 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-config-data" (OuterVolumeSpecName: "config-data") pod "084f7e37-6fec-4747-9bc6-c1e0bff98ab1" (UID: "084f7e37-6fec-4747-9bc6-c1e0bff98ab1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.749350 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pq9b\" (UniqueName: \"kubernetes.io/projected/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-kube-api-access-2pq9b\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.749388 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.749399 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.749409 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.789206 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "084f7e37-6fec-4747-9bc6-c1e0bff98ab1" (UID: "084f7e37-6fec-4747-9bc6-c1e0bff98ab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:43:19 crc kubenswrapper[4929]: I1002 12:43:19.850854 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084f7e37-6fec-4747-9bc6-c1e0bff98ab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.220621 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfwkm" event={"ID":"084f7e37-6fec-4747-9bc6-c1e0bff98ab1","Type":"ContainerDied","Data":"54f7690cd9aace779ecbec9473a284c186c9a35b6ad5b531f81cbe7925d7c877"} Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.220672 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54f7690cd9aace779ecbec9473a284c186c9a35b6ad5b531f81cbe7925d7c877" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.220687 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfwkm" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.318876 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f684d7d76-vbzgj"] Oct 02 12:43:20 crc kubenswrapper[4929]: E1002 12:43:20.319242 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084f7e37-6fec-4747-9bc6-c1e0bff98ab1" containerName="placement-db-sync" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.319256 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="084f7e37-6fec-4747-9bc6-c1e0bff98ab1" containerName="placement-db-sync" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.319430 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="084f7e37-6fec-4747-9bc6-c1e0bff98ab1" containerName="placement-db-sync" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.320317 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.322371 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.322454 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.323078 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-f7hqc" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.345001 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f684d7d76-vbzgj"] Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.360256 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-combined-ca-bundle\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.360329 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-scripts\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.360389 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-config-data\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.360439 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f87c45ef-7b90-4dde-ada1-820233761ec1-logs\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.360488 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47ps\" (UniqueName: \"kubernetes.io/projected/f87c45ef-7b90-4dde-ada1-820233761ec1-kube-api-access-g47ps\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.462493 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-config-data\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.462539 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f87c45ef-7b90-4dde-ada1-820233761ec1-logs\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.462639 4929 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47ps\" (UniqueName: \"kubernetes.io/projected/f87c45ef-7b90-4dde-ada1-820233761ec1-kube-api-access-g47ps\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.463154 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f87c45ef-7b90-4dde-ada1-820233761ec1-logs\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.464836 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-combined-ca-bundle\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.464910 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-scripts\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.469301 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-config-data\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.469663 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-combined-ca-bundle\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.470662 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87c45ef-7b90-4dde-ada1-820233761ec1-scripts\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.487646 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47ps\" (UniqueName: \"kubernetes.io/projected/f87c45ef-7b90-4dde-ada1-820233761ec1-kube-api-access-g47ps\") pod \"placement-6f684d7d76-vbzgj\" (UID: \"f87c45ef-7b90-4dde-ada1-820233761ec1\") " pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:20 crc kubenswrapper[4929]: I1002 12:43:20.642251 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:21 crc kubenswrapper[4929]: I1002 12:43:21.080215 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f684d7d76-vbzgj"] Oct 02 12:43:21 crc kubenswrapper[4929]: W1002 12:43:21.083472 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf87c45ef_7b90_4dde_ada1_820233761ec1.slice/crio-4368ff44f4cdb4dce780e93d515aff082e63d30a9fe160f5284a45a4b8c3441e WatchSource:0}: Error finding container 4368ff44f4cdb4dce780e93d515aff082e63d30a9fe160f5284a45a4b8c3441e: Status 404 returned error can't find the container with id 4368ff44f4cdb4dce780e93d515aff082e63d30a9fe160f5284a45a4b8c3441e Oct 02 12:43:21 crc kubenswrapper[4929]: I1002 12:43:21.228870 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f684d7d76-vbzgj" event={"ID":"f87c45ef-7b90-4dde-ada1-820233761ec1","Type":"ContainerStarted","Data":"4368ff44f4cdb4dce780e93d515aff082e63d30a9fe160f5284a45a4b8c3441e"} Oct 02 12:43:21 crc kubenswrapper[4929]: I1002 12:43:21.614051 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:43:21 crc kubenswrapper[4929]: I1002 12:43:21.666354 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bcd9f45f-ckxz8"] Oct 02 12:43:21 crc kubenswrapper[4929]: I1002 12:43:21.666673 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" podUID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerName="dnsmasq-dns" containerID="cri-o://0eaed74e6a15055d5d32302c5269f13970bc413e2bd1003e855b508594c7ad07" gracePeriod=10 Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.239398 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f684d7d76-vbzgj" event={"ID":"f87c45ef-7b90-4dde-ada1-820233761ec1","Type":"ContainerStarted","Data":"fe24ea10f7cac54f059da87844b3d0d222ed4a7241f7b339bde58322275097cc"} Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.239770 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f684d7d76-vbzgj" event={"ID":"f87c45ef-7b90-4dde-ada1-820233761ec1","Type":"ContainerStarted","Data":"56604c072371cb692d6f26bc103f56d29b40f9c8f50feb8dbd0fd5c18c698a6c"} Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.241918 4929 generic.go:334] "Generic (PLEG): container finished" podID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerID="0eaed74e6a15055d5d32302c5269f13970bc413e2bd1003e855b508594c7ad07" exitCode=0 Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.241977 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" event={"ID":"99ebc735-e9a7-4713-acfb-5fcf1c091db7","Type":"ContainerDied","Data":"0eaed74e6a15055d5d32302c5269f13970bc413e2bd1003e855b508594c7ad07"} Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.685517 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.708758 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d87bf\" (UniqueName: \"kubernetes.io/projected/99ebc735-e9a7-4713-acfb-5fcf1c091db7-kube-api-access-d87bf\") pod \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.708912 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-dns-svc\") pod \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.709055 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-config\") pod \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.709083 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-nb\") pod \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.709715 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-sb\") pod \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\" (UID: \"99ebc735-e9a7-4713-acfb-5fcf1c091db7\") " Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.714887 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ebc735-e9a7-4713-acfb-5fcf1c091db7-kube-api-access-d87bf" (OuterVolumeSpecName: "kube-api-access-d87bf") pod "99ebc735-e9a7-4713-acfb-5fcf1c091db7" (UID: "99ebc735-e9a7-4713-acfb-5fcf1c091db7"). InnerVolumeSpecName "kube-api-access-d87bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.762583 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-config" (OuterVolumeSpecName: "config") pod "99ebc735-e9a7-4713-acfb-5fcf1c091db7" (UID: "99ebc735-e9a7-4713-acfb-5fcf1c091db7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.769664 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99ebc735-e9a7-4713-acfb-5fcf1c091db7" (UID: "99ebc735-e9a7-4713-acfb-5fcf1c091db7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.770747 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99ebc735-e9a7-4713-acfb-5fcf1c091db7" (UID: "99ebc735-e9a7-4713-acfb-5fcf1c091db7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.779793 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99ebc735-e9a7-4713-acfb-5fcf1c091db7" (UID: "99ebc735-e9a7-4713-acfb-5fcf1c091db7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.812913 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.812988 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d87bf\" (UniqueName: \"kubernetes.io/projected/99ebc735-e9a7-4713-acfb-5fcf1c091db7-kube-api-access-d87bf\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.813009 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.813020 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:22 crc kubenswrapper[4929]: I1002 12:43:22.813032 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ebc735-e9a7-4713-acfb-5fcf1c091db7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.251613 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.251591 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcd9f45f-ckxz8" event={"ID":"99ebc735-e9a7-4713-acfb-5fcf1c091db7","Type":"ContainerDied","Data":"4961dc2476f049ef6cc591fcee1e0305fd47de86c477c9cffde8ef3d8b2e8f37"} Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.252387 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.252409 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.252436 4929 scope.go:117] "RemoveContainer" containerID="0eaed74e6a15055d5d32302c5269f13970bc413e2bd1003e855b508594c7ad07" Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.270423 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f684d7d76-vbzgj" podStartSLOduration=3.270397016 podStartE2EDuration="3.270397016s" podCreationTimestamp="2025-10-02 12:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:43:23.269660215 +0000 UTC m=+5603.820026619" watchObservedRunningTime="2025-10-02 12:43:23.270397016 +0000 UTC m=+5603.820763390" Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.274642 4929 scope.go:117] "RemoveContainer" containerID="c9579f1c0623d2330c097ab92fe29af08eb511c5a730495022957d926f40b439" Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.288418 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bcd9f45f-ckxz8"] Oct 02 12:43:23 crc kubenswrapper[4929]: I1002 12:43:23.295166 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bcd9f45f-ckxz8"] Oct 02 12:43:24 crc kubenswrapper[4929]: I1002 12:43:24.169066 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" path="/var/lib/kubelet/pods/99ebc735-e9a7-4713-acfb-5fcf1c091db7/volumes" Oct 02 12:43:51 crc kubenswrapper[4929]: I1002 12:43:51.706768 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:43:51 crc kubenswrapper[4929]: I1002 12:43:51.717886 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f684d7d76-vbzgj" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.486110 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9k24b"] Oct 02 12:44:12 crc kubenswrapper[4929]: E1002 12:44:12.488418 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerName="init" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.488452 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerName="init" Oct 02 12:44:12 crc kubenswrapper[4929]: E1002 12:44:12.488489 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerName="dnsmasq-dns" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.488499 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerName="dnsmasq-dns" Oct 02 12:44:12 crc 
kubenswrapper[4929]: I1002 12:44:12.489270 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ebc735-e9a7-4713-acfb-5fcf1c091db7" containerName="dnsmasq-dns" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.490399 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9k24b" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.519195 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9k24b"] Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.593111 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gpzrh"] Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.594522 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gpzrh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.598721 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gpzrh"] Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.668370 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp9pr\" (UniqueName: \"kubernetes.io/projected/9ef2c687-95a7-4ca2-a7b7-eb93733b1101-kube-api-access-mp9pr\") pod \"nova-api-db-create-9k24b\" (UID: \"9ef2c687-95a7-4ca2-a7b7-eb93733b1101\") " pod="openstack/nova-api-db-create-9k24b" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.771440 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jwsl\" (UniqueName: \"kubernetes.io/projected/2d9ec94a-8101-44c6-a92f-9999dcb58e1a-kube-api-access-2jwsl\") pod \"nova-cell0-db-create-gpzrh\" (UID: \"2d9ec94a-8101-44c6-a92f-9999dcb58e1a\") " pod="openstack/nova-cell0-db-create-gpzrh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.771582 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp9pr\" (UniqueName: \"kubernetes.io/projected/9ef2c687-95a7-4ca2-a7b7-eb93733b1101-kube-api-access-mp9pr\") pod \"nova-api-db-create-9k24b\" (UID: \"9ef2c687-95a7-4ca2-a7b7-eb93733b1101\") " pod="openstack/nova-api-db-create-9k24b" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.784917 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-96nmh"] Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.786441 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-96nmh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.800144 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-96nmh"] Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.819004 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp9pr\" (UniqueName: \"kubernetes.io/projected/9ef2c687-95a7-4ca2-a7b7-eb93733b1101-kube-api-access-mp9pr\") pod \"nova-api-db-create-9k24b\" (UID: \"9ef2c687-95a7-4ca2-a7b7-eb93733b1101\") " pod="openstack/nova-api-db-create-9k24b" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.821454 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9k24b" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.872833 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jwsl\" (UniqueName: \"kubernetes.io/projected/2d9ec94a-8101-44c6-a92f-9999dcb58e1a-kube-api-access-2jwsl\") pod \"nova-cell0-db-create-gpzrh\" (UID: \"2d9ec94a-8101-44c6-a92f-9999dcb58e1a\") " pod="openstack/nova-cell0-db-create-gpzrh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.872929 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lmzr\" (UniqueName: \"kubernetes.io/projected/08a55624-e5d7-4ee6-b23d-8dffe83d54b3-kube-api-access-2lmzr\") pod \"nova-cell1-db-create-96nmh\" (UID: \"08a55624-e5d7-4ee6-b23d-8dffe83d54b3\") " pod="openstack/nova-cell1-db-create-96nmh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.894452 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jwsl\" (UniqueName: \"kubernetes.io/projected/2d9ec94a-8101-44c6-a92f-9999dcb58e1a-kube-api-access-2jwsl\") pod \"nova-cell0-db-create-gpzrh\" (UID: \"2d9ec94a-8101-44c6-a92f-9999dcb58e1a\") " pod="openstack/nova-cell0-db-create-gpzrh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.915885 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gpzrh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.974550 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lmzr\" (UniqueName: \"kubernetes.io/projected/08a55624-e5d7-4ee6-b23d-8dffe83d54b3-kube-api-access-2lmzr\") pod \"nova-cell1-db-create-96nmh\" (UID: \"08a55624-e5d7-4ee6-b23d-8dffe83d54b3\") " pod="openstack/nova-cell1-db-create-96nmh" Oct 02 12:44:12 crc kubenswrapper[4929]: I1002 12:44:12.995755 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lmzr\" (UniqueName: \"kubernetes.io/projected/08a55624-e5d7-4ee6-b23d-8dffe83d54b3-kube-api-access-2lmzr\") pod \"nova-cell1-db-create-96nmh\" (UID: \"08a55624-e5d7-4ee6-b23d-8dffe83d54b3\") " pod="openstack/nova-cell1-db-create-96nmh" Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.147073 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-96nmh" Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.320054 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9k24b"] Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.441698 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gpzrh"] Oct 02 12:44:13 crc kubenswrapper[4929]: W1002 12:44:13.443179 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d9ec94a_8101_44c6_a92f_9999dcb58e1a.slice/crio-3146afc06fc9970131c64075d86e190c9790a4b5b0e8f4d4c7bc93603fcd6087 WatchSource:0}: Error finding container 3146afc06fc9970131c64075d86e190c9790a4b5b0e8f4d4c7bc93603fcd6087: Status 404 returned error can't find the container with id 3146afc06fc9970131c64075d86e190c9790a4b5b0e8f4d4c7bc93603fcd6087 Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.608414 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-96nmh"] Oct 02 12:44:13 crc kubenswrapper[4929]: W1002 12:44:13.612747 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a55624_e5d7_4ee6_b23d_8dffe83d54b3.slice/crio-69ab70f380fd19c586472b940451feec6a34a4c1a50d5b05fcb01aae23c8c85c WatchSource:0}: Error finding container 69ab70f380fd19c586472b940451feec6a34a4c1a50d5b05fcb01aae23c8c85c: Status 404 returned error can't find the container with id 69ab70f380fd19c586472b940451feec6a34a4c1a50d5b05fcb01aae23c8c85c Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.710426 4929 generic.go:334] "Generic (PLEG): container finished" podID="9ef2c687-95a7-4ca2-a7b7-eb93733b1101" containerID="814c660a3434c370a8db2d8f9c12de347b2267020c9a397d9e02fac00651c7bb" exitCode=0 Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.710838 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9k24b" event={"ID":"9ef2c687-95a7-4ca2-a7b7-eb93733b1101","Type":"ContainerDied","Data":"814c660a3434c370a8db2d8f9c12de347b2267020c9a397d9e02fac00651c7bb"} Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.710923 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9k24b" event={"ID":"9ef2c687-95a7-4ca2-a7b7-eb93733b1101","Type":"ContainerStarted","Data":"f80b8b66aebdc9b4bccc073f8bfdac75e2049bedfb34bb4fabde79d9d404d4fe"} Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.716093 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-96nmh" event={"ID":"08a55624-e5d7-4ee6-b23d-8dffe83d54b3","Type":"ContainerStarted","Data":"69ab70f380fd19c586472b940451feec6a34a4c1a50d5b05fcb01aae23c8c85c"} Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.718654 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gpzrh" event={"ID":"2d9ec94a-8101-44c6-a92f-9999dcb58e1a","Type":"ContainerStarted","Data":"8d7df33ee47a37fe18ac3f14dad21af8031616527d1a176b9a5726dfe6628108"} Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.718719 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gpzrh" event={"ID":"2d9ec94a-8101-44c6-a92f-9999dcb58e1a","Type":"ContainerStarted","Data":"3146afc06fc9970131c64075d86e190c9790a4b5b0e8f4d4c7bc93603fcd6087"} Oct 02 12:44:13 crc kubenswrapper[4929]: I1002 12:44:13.745246 4929 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell0-db-create-gpzrh" podStartSLOduration=1.7452230210000002 podStartE2EDuration="1.745223021s" podCreationTimestamp="2025-10-02 12:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:13.743162951 +0000 UTC m=+5654.293529315" watchObservedRunningTime="2025-10-02 12:44:13.745223021 +0000 UTC m=+5654.295589395" Oct 02 12:44:14 crc kubenswrapper[4929]: I1002 12:44:14.735265 4929 generic.go:334] "Generic (PLEG): container finished" podID="08a55624-e5d7-4ee6-b23d-8dffe83d54b3" containerID="27a037976d0d34792eb18d25b699807582081b85779e7560ae87d589a0faa9f0" exitCode=0 Oct 02 12:44:14 crc kubenswrapper[4929]: I1002 12:44:14.735789 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-96nmh" event={"ID":"08a55624-e5d7-4ee6-b23d-8dffe83d54b3","Type":"ContainerDied","Data":"27a037976d0d34792eb18d25b699807582081b85779e7560ae87d589a0faa9f0"} Oct 02 12:44:14 crc kubenswrapper[4929]: I1002 12:44:14.743359 4929 generic.go:334] "Generic (PLEG): container finished" podID="2d9ec94a-8101-44c6-a92f-9999dcb58e1a" containerID="8d7df33ee47a37fe18ac3f14dad21af8031616527d1a176b9a5726dfe6628108" exitCode=0 Oct 02 12:44:14 crc kubenswrapper[4929]: I1002 12:44:14.743723 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gpzrh" event={"ID":"2d9ec94a-8101-44c6-a92f-9999dcb58e1a","Type":"ContainerDied","Data":"8d7df33ee47a37fe18ac3f14dad21af8031616527d1a176b9a5726dfe6628108"} Oct 02 12:44:15 crc kubenswrapper[4929]: I1002 12:44:15.044581 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9k24b" Oct 02 12:44:15 crc kubenswrapper[4929]: I1002 12:44:15.117141 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp9pr\" (UniqueName: \"kubernetes.io/projected/9ef2c687-95a7-4ca2-a7b7-eb93733b1101-kube-api-access-mp9pr\") pod \"9ef2c687-95a7-4ca2-a7b7-eb93733b1101\" (UID: \"9ef2c687-95a7-4ca2-a7b7-eb93733b1101\") " Oct 02 12:44:15 crc kubenswrapper[4929]: I1002 12:44:15.124839 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef2c687-95a7-4ca2-a7b7-eb93733b1101-kube-api-access-mp9pr" (OuterVolumeSpecName: "kube-api-access-mp9pr") pod "9ef2c687-95a7-4ca2-a7b7-eb93733b1101" (UID: "9ef2c687-95a7-4ca2-a7b7-eb93733b1101"). InnerVolumeSpecName "kube-api-access-mp9pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:15 crc kubenswrapper[4929]: I1002 12:44:15.218926 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp9pr\" (UniqueName: \"kubernetes.io/projected/9ef2c687-95a7-4ca2-a7b7-eb93733b1101-kube-api-access-mp9pr\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:15 crc kubenswrapper[4929]: I1002 12:44:15.753258 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9k24b" event={"ID":"9ef2c687-95a7-4ca2-a7b7-eb93733b1101","Type":"ContainerDied","Data":"f80b8b66aebdc9b4bccc073f8bfdac75e2049bedfb34bb4fabde79d9d404d4fe"} Oct 02 12:44:15 crc kubenswrapper[4929]: I1002 12:44:15.753365 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80b8b66aebdc9b4bccc073f8bfdac75e2049bedfb34bb4fabde79d9d404d4fe" Oct 02 12:44:15 crc kubenswrapper[4929]: I1002 12:44:15.753424 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9k24b" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.122558 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gpzrh" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.128631 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-96nmh" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.237080 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lmzr\" (UniqueName: \"kubernetes.io/projected/08a55624-e5d7-4ee6-b23d-8dffe83d54b3-kube-api-access-2lmzr\") pod \"08a55624-e5d7-4ee6-b23d-8dffe83d54b3\" (UID: \"08a55624-e5d7-4ee6-b23d-8dffe83d54b3\") " Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.237213 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jwsl\" (UniqueName: \"kubernetes.io/projected/2d9ec94a-8101-44c6-a92f-9999dcb58e1a-kube-api-access-2jwsl\") pod \"2d9ec94a-8101-44c6-a92f-9999dcb58e1a\" (UID: \"2d9ec94a-8101-44c6-a92f-9999dcb58e1a\") " Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.240341 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a55624-e5d7-4ee6-b23d-8dffe83d54b3-kube-api-access-2lmzr" (OuterVolumeSpecName: "kube-api-access-2lmzr") pod "08a55624-e5d7-4ee6-b23d-8dffe83d54b3" (UID: "08a55624-e5d7-4ee6-b23d-8dffe83d54b3"). InnerVolumeSpecName "kube-api-access-2lmzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.240564 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d9ec94a-8101-44c6-a92f-9999dcb58e1a-kube-api-access-2jwsl" (OuterVolumeSpecName: "kube-api-access-2jwsl") pod "2d9ec94a-8101-44c6-a92f-9999dcb58e1a" (UID: "2d9ec94a-8101-44c6-a92f-9999dcb58e1a"). InnerVolumeSpecName "kube-api-access-2jwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.300875 4929 scope.go:117] "RemoveContainer" containerID="b738598602d618870e97f8380351e0ec26acef0127740d0fda55c76890576d9b" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.339325 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lmzr\" (UniqueName: \"kubernetes.io/projected/08a55624-e5d7-4ee6-b23d-8dffe83d54b3-kube-api-access-2lmzr\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.339359 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jwsl\" (UniqueName: \"kubernetes.io/projected/2d9ec94a-8101-44c6-a92f-9999dcb58e1a-kube-api-access-2jwsl\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.762640 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-96nmh" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.762639 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-96nmh" event={"ID":"08a55624-e5d7-4ee6-b23d-8dffe83d54b3","Type":"ContainerDied","Data":"69ab70f380fd19c586472b940451feec6a34a4c1a50d5b05fcb01aae23c8c85c"} Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.762758 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69ab70f380fd19c586472b940451feec6a34a4c1a50d5b05fcb01aae23c8c85c" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.764157 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gpzrh" Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.764145 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gpzrh" event={"ID":"2d9ec94a-8101-44c6-a92f-9999dcb58e1a","Type":"ContainerDied","Data":"3146afc06fc9970131c64075d86e190c9790a4b5b0e8f4d4c7bc93603fcd6087"} Oct 02 12:44:16 crc kubenswrapper[4929]: I1002 12:44:16.764286 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3146afc06fc9970131c64075d86e190c9790a4b5b0e8f4d4c7bc93603fcd6087" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.726235 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4baa-account-create-xxb26"] Oct 02 12:44:22 crc kubenswrapper[4929]: E1002 12:44:22.727052 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef2c687-95a7-4ca2-a7b7-eb93733b1101" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.727066 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef2c687-95a7-4ca2-a7b7-eb93733b1101" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: E1002 12:44:22.727082 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d9ec94a-8101-44c6-a92f-9999dcb58e1a" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.727089 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d9ec94a-8101-44c6-a92f-9999dcb58e1a" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: E1002 12:44:22.727100 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a55624-e5d7-4ee6-b23d-8dffe83d54b3" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.727106 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a55624-e5d7-4ee6-b23d-8dffe83d54b3" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.727289 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d9ec94a-8101-44c6-a92f-9999dcb58e1a" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.727303 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a55624-e5d7-4ee6-b23d-8dffe83d54b3" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.727315 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef2c687-95a7-4ca2-a7b7-eb93733b1101" containerName="mariadb-database-create" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.727855 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4baa-account-create-xxb26" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.729463 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.737575 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4baa-account-create-xxb26"] Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.869385 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt8px\" (UniqueName: \"kubernetes.io/projected/f3644ee3-4a0a-4623-bf6d-c1b8302e8baa-kube-api-access-lt8px\") pod \"nova-api-4baa-account-create-xxb26\" (UID: \"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa\") " pod="openstack/nova-api-4baa-account-create-xxb26" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.914654 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2fa9-account-create-6rgp4"] Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.915929 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.917729 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.925101 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2fa9-account-create-6rgp4"] Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.971254 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt8px\" (UniqueName: \"kubernetes.io/projected/f3644ee3-4a0a-4623-bf6d-c1b8302e8baa-kube-api-access-lt8px\") pod \"nova-api-4baa-account-create-xxb26\" (UID: \"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa\") " pod="openstack/nova-api-4baa-account-create-xxb26" Oct 02 12:44:22 crc kubenswrapper[4929]: I1002 12:44:22.995023 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt8px\" (UniqueName: \"kubernetes.io/projected/f3644ee3-4a0a-4623-bf6d-c1b8302e8baa-kube-api-access-lt8px\") pod \"nova-api-4baa-account-create-xxb26\" (UID: \"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa\") " pod="openstack/nova-api-4baa-account-create-xxb26" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.052876 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4baa-account-create-xxb26" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.073328 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xblf7\" (UniqueName: \"kubernetes.io/projected/0714e7b6-c0b5-419f-b517-d71ae7f2a6ae-kube-api-access-xblf7\") pod \"nova-cell0-2fa9-account-create-6rgp4\" (UID: \"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae\") " pod="openstack/nova-cell0-2fa9-account-create-6rgp4" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.127162 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-94f9-account-create-zjvn2"] Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.128801 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94f9-account-create-zjvn2" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.132386 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.134582 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94f9-account-create-zjvn2"] Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.174824 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xblf7\" (UniqueName: \"kubernetes.io/projected/0714e7b6-c0b5-419f-b517-d71ae7f2a6ae-kube-api-access-xblf7\") pod \"nova-cell0-2fa9-account-create-6rgp4\" (UID: \"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae\") " pod="openstack/nova-cell0-2fa9-account-create-6rgp4" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.204688 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xblf7\" (UniqueName: \"kubernetes.io/projected/0714e7b6-c0b5-419f-b517-d71ae7f2a6ae-kube-api-access-xblf7\") pod \"nova-cell0-2fa9-account-create-6rgp4\" (UID: \"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae\") " pod="openstack/nova-cell0-2fa9-account-create-6rgp4" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.239523 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.275942 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnrh\" (UniqueName: \"kubernetes.io/projected/056a2f7c-05d0-413b-9fdd-52493121b1f4-kube-api-access-mgnrh\") pod \"nova-cell1-94f9-account-create-zjvn2\" (UID: \"056a2f7c-05d0-413b-9fdd-52493121b1f4\") " pod="openstack/nova-cell1-94f9-account-create-zjvn2" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.378260 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgnrh\" (UniqueName: \"kubernetes.io/projected/056a2f7c-05d0-413b-9fdd-52493121b1f4-kube-api-access-mgnrh\") pod \"nova-cell1-94f9-account-create-zjvn2\" (UID: \"056a2f7c-05d0-413b-9fdd-52493121b1f4\") " pod="openstack/nova-cell1-94f9-account-create-zjvn2" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.397774 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgnrh\" (UniqueName: \"kubernetes.io/projected/056a2f7c-05d0-413b-9fdd-52493121b1f4-kube-api-access-mgnrh\") pod \"nova-cell1-94f9-account-create-zjvn2\" (UID: \"056a2f7c-05d0-413b-9fdd-52493121b1f4\") " pod="openstack/nova-cell1-94f9-account-create-zjvn2" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.489674 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94f9-account-create-zjvn2" Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.523713 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4baa-account-create-xxb26"] Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.647542 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2fa9-account-create-6rgp4"] Oct 02 12:44:23 crc kubenswrapper[4929]: W1002 12:44:23.657383 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0714e7b6_c0b5_419f_b517_d71ae7f2a6ae.slice/crio-6faa045f1374dee9b913e8b504babc529221dd05d63223dc893a445ccdc3914a WatchSource:0}: Error finding container 6faa045f1374dee9b913e8b504babc529221dd05d63223dc893a445ccdc3914a: Status 404 returned error can't find the container with id 6faa045f1374dee9b913e8b504babc529221dd05d63223dc893a445ccdc3914a Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.828875 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4baa-account-create-xxb26" event={"ID":"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa","Type":"ContainerStarted","Data":"c3bb585c8a289d60b72eab1ded792bcece5a7c86bce449978a083b6ad756b47f"} Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.828933 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4baa-account-create-xxb26" event={"ID":"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa","Type":"ContainerStarted","Data":"68d43e8f9c10ac67f845d418120c68c4f2d1cab2da6d992705a73096023819d1"} Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.830196 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" event={"ID":"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae","Type":"ContainerStarted","Data":"6faa045f1374dee9b913e8b504babc529221dd05d63223dc893a445ccdc3914a"} Oct 02 12:44:23 crc kubenswrapper[4929]: I1002 12:44:23.912365 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94f9-account-create-zjvn2"] Oct 02 12:44:23 crc kubenswrapper[4929]: W1002 12:44:23.912758 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod056a2f7c_05d0_413b_9fdd_52493121b1f4.slice/crio-9941a0f49bac05d8aea3ba4ffb4ad51ca0409c6a7579c7123a3d2085233d06e9 WatchSource:0}: Error finding container 9941a0f49bac05d8aea3ba4ffb4ad51ca0409c6a7579c7123a3d2085233d06e9: Status 404 returned error can't find the container with id 9941a0f49bac05d8aea3ba4ffb4ad51ca0409c6a7579c7123a3d2085233d06e9 Oct 02 12:44:24 crc kubenswrapper[4929]: I1002 12:44:24.844420 4929 generic.go:334] "Generic (PLEG): container finished" podID="f3644ee3-4a0a-4623-bf6d-c1b8302e8baa" containerID="c3bb585c8a289d60b72eab1ded792bcece5a7c86bce449978a083b6ad756b47f" exitCode=0 Oct 02 12:44:24 crc kubenswrapper[4929]: I1002 12:44:24.844784 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4baa-account-create-xxb26" event={"ID":"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa","Type":"ContainerDied","Data":"c3bb585c8a289d60b72eab1ded792bcece5a7c86bce449978a083b6ad756b47f"} Oct 02 12:44:24 crc kubenswrapper[4929]: I1002 12:44:24.847589 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" event={"ID":"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae","Type":"ContainerStarted","Data":"7609d508a51d8c5f8eb4128611d34cbc3442a69f30a4f7a4b4d17a52f46926c0"} Oct 02 12:44:24 
crc kubenswrapper[4929]: I1002 12:44:24.849481 4929 generic.go:334] "Generic (PLEG): container finished" podID="056a2f7c-05d0-413b-9fdd-52493121b1f4" containerID="8fe17b090a5d90d1f7cc5553174538ccac24094b91760141b50d2ee4fe9ffa42" exitCode=0 Oct 02 12:44:24 crc kubenswrapper[4929]: I1002 12:44:24.849530 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94f9-account-create-zjvn2" event={"ID":"056a2f7c-05d0-413b-9fdd-52493121b1f4","Type":"ContainerDied","Data":"8fe17b090a5d90d1f7cc5553174538ccac24094b91760141b50d2ee4fe9ffa42"} Oct 02 12:44:24 crc kubenswrapper[4929]: I1002 12:44:24.849556 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94f9-account-create-zjvn2" event={"ID":"056a2f7c-05d0-413b-9fdd-52493121b1f4","Type":"ContainerStarted","Data":"9941a0f49bac05d8aea3ba4ffb4ad51ca0409c6a7579c7123a3d2085233d06e9"} Oct 02 12:44:25 crc kubenswrapper[4929]: I1002 12:44:25.863440 4929 generic.go:334] "Generic (PLEG): container finished" podID="0714e7b6-c0b5-419f-b517-d71ae7f2a6ae" containerID="7609d508a51d8c5f8eb4128611d34cbc3442a69f30a4f7a4b4d17a52f46926c0" exitCode=0 Oct 02 12:44:25 crc kubenswrapper[4929]: I1002 12:44:25.863581 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" event={"ID":"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae","Type":"ContainerDied","Data":"7609d508a51d8c5f8eb4128611d34cbc3442a69f30a4f7a4b4d17a52f46926c0"} Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.313127 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4baa-account-create-xxb26" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.332419 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.342879 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-94f9-account-create-zjvn2" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.437838 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt8px\" (UniqueName: \"kubernetes.io/projected/f3644ee3-4a0a-4623-bf6d-c1b8302e8baa-kube-api-access-lt8px\") pod \"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa\" (UID: \"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa\") " Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.438203 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgnrh\" (UniqueName: \"kubernetes.io/projected/056a2f7c-05d0-413b-9fdd-52493121b1f4-kube-api-access-mgnrh\") pod \"056a2f7c-05d0-413b-9fdd-52493121b1f4\" (UID: \"056a2f7c-05d0-413b-9fdd-52493121b1f4\") " Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.438377 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xblf7\" (UniqueName: \"kubernetes.io/projected/0714e7b6-c0b5-419f-b517-d71ae7f2a6ae-kube-api-access-xblf7\") pod \"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae\" (UID: \"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae\") " Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.444814 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3644ee3-4a0a-4623-bf6d-c1b8302e8baa-kube-api-access-lt8px" (OuterVolumeSpecName: "kube-api-access-lt8px") pod "f3644ee3-4a0a-4623-bf6d-c1b8302e8baa" (UID: "f3644ee3-4a0a-4623-bf6d-c1b8302e8baa"). 
InnerVolumeSpecName "kube-api-access-lt8px". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.444859 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056a2f7c-05d0-413b-9fdd-52493121b1f4-kube-api-access-mgnrh" (OuterVolumeSpecName: "kube-api-access-mgnrh") pod "056a2f7c-05d0-413b-9fdd-52493121b1f4" (UID: "056a2f7c-05d0-413b-9fdd-52493121b1f4"). InnerVolumeSpecName "kube-api-access-mgnrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.445054 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0714e7b6-c0b5-419f-b517-d71ae7f2a6ae-kube-api-access-xblf7" (OuterVolumeSpecName: "kube-api-access-xblf7") pod "0714e7b6-c0b5-419f-b517-d71ae7f2a6ae" (UID: "0714e7b6-c0b5-419f-b517-d71ae7f2a6ae"). InnerVolumeSpecName "kube-api-access-xblf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.541004 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgnrh\" (UniqueName: \"kubernetes.io/projected/056a2f7c-05d0-413b-9fdd-52493121b1f4-kube-api-access-mgnrh\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.541046 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xblf7\" (UniqueName: \"kubernetes.io/projected/0714e7b6-c0b5-419f-b517-d71ae7f2a6ae-kube-api-access-xblf7\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.541059 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt8px\" (UniqueName: \"kubernetes.io/projected/f3644ee3-4a0a-4623-bf6d-c1b8302e8baa-kube-api-access-lt8px\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.876212 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94f9-account-create-zjvn2" event={"ID":"056a2f7c-05d0-413b-9fdd-52493121b1f4","Type":"ContainerDied","Data":"9941a0f49bac05d8aea3ba4ffb4ad51ca0409c6a7579c7123a3d2085233d06e9"} Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.876250 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-94f9-account-create-zjvn2" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.876274 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9941a0f49bac05d8aea3ba4ffb4ad51ca0409c6a7579c7123a3d2085233d06e9" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.877891 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4baa-account-create-xxb26" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.877886 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4baa-account-create-xxb26" event={"ID":"f3644ee3-4a0a-4623-bf6d-c1b8302e8baa","Type":"ContainerDied","Data":"68d43e8f9c10ac67f845d418120c68c4f2d1cab2da6d992705a73096023819d1"} Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.878025 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d43e8f9c10ac67f845d418120c68c4f2d1cab2da6d992705a73096023819d1" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.880159 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" event={"ID":"0714e7b6-c0b5-419f-b517-d71ae7f2a6ae","Type":"ContainerDied","Data":"6faa045f1374dee9b913e8b504babc529221dd05d63223dc893a445ccdc3914a"} Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.880194 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6faa045f1374dee9b913e8b504babc529221dd05d63223dc893a445ccdc3914a" Oct 02 12:44:26 crc kubenswrapper[4929]: I1002 12:44:26.880199 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2fa9-account-create-6rgp4" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.171348 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gpx7r"] Oct 02 12:44:28 crc kubenswrapper[4929]: E1002 12:44:28.171838 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3644ee3-4a0a-4623-bf6d-c1b8302e8baa" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.171849 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3644ee3-4a0a-4623-bf6d-c1b8302e8baa" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: E1002 12:44:28.171873 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="056a2f7c-05d0-413b-9fdd-52493121b1f4" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.171879 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="056a2f7c-05d0-413b-9fdd-52493121b1f4" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: E1002 12:44:28.171889 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0714e7b6-c0b5-419f-b517-d71ae7f2a6ae" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.171895 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0714e7b6-c0b5-419f-b517-d71ae7f2a6ae" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.172077 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0714e7b6-c0b5-419f-b517-d71ae7f2a6ae" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.172087 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="056a2f7c-05d0-413b-9fdd-52493121b1f4" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.172101 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3644ee3-4a0a-4623-bf6d-c1b8302e8baa" containerName="mariadb-account-create" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.172682 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.174229 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9qsrr" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.175231 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.176207 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gpx7r"] Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.181163 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.271896 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5j6f\" (UniqueName: \"kubernetes.io/projected/cbd7de5b-4bbd-42a0-b032-6177901cab75-kube-api-access-b5j6f\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.272020 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-scripts\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.272104 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-config-data\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.272225 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.373814 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5j6f\" (UniqueName: \"kubernetes.io/projected/cbd7de5b-4bbd-42a0-b032-6177901cab75-kube-api-access-b5j6f\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.374240 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-scripts\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.374922 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-config-data\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: 
\"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.375006 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.379047 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.379172 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-config-data\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.387615 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-scripts\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.389319 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5j6f\" (UniqueName: \"kubernetes.io/projected/cbd7de5b-4bbd-42a0-b032-6177901cab75-kube-api-access-b5j6f\") pod \"nova-cell0-conductor-db-sync-gpx7r\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.488781 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:28 crc kubenswrapper[4929]: I1002 12:44:28.961929 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gpx7r"] Oct 02 12:44:28 crc kubenswrapper[4929]: W1002 12:44:28.967737 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbd7de5b_4bbd_42a0_b032_6177901cab75.slice/crio-34de50094c5c25432bf95b80497d92be651eb5d7cae858cd1031c5e7f25383af WatchSource:0}: Error finding container 34de50094c5c25432bf95b80497d92be651eb5d7cae858cd1031c5e7f25383af: Status 404 returned error can't find the container with id 34de50094c5c25432bf95b80497d92be651eb5d7cae858cd1031c5e7f25383af Oct 02 12:44:29 crc kubenswrapper[4929]: I1002 12:44:29.907796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" event={"ID":"cbd7de5b-4bbd-42a0-b032-6177901cab75","Type":"ContainerStarted","Data":"dbdd91ae8e0158aaeab352d333673767a30d5b08904d28e9bc076b14dd10e2f9"} Oct 02 12:44:29 crc kubenswrapper[4929]: I1002 12:44:29.908139 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" event={"ID":"cbd7de5b-4bbd-42a0-b032-6177901cab75","Type":"ContainerStarted","Data":"34de50094c5c25432bf95b80497d92be651eb5d7cae858cd1031c5e7f25383af"} Oct 02 12:44:29 crc kubenswrapper[4929]: I1002 12:44:29.924779 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" podStartSLOduration=1.924763708 podStartE2EDuration="1.924763708s" podCreationTimestamp="2025-10-02 12:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:29.92381067 +0000 UTC m=+5670.474177054" watchObservedRunningTime="2025-10-02 12:44:29.924763708 +0000 UTC m=+5670.475130072" Oct 02 12:44:34 crc kubenswrapper[4929]: I1002 12:44:34.961766 4929 generic.go:334] "Generic (PLEG): container finished" podID="cbd7de5b-4bbd-42a0-b032-6177901cab75" containerID="dbdd91ae8e0158aaeab352d333673767a30d5b08904d28e9bc076b14dd10e2f9" exitCode=0 Oct 02 12:44:34 crc kubenswrapper[4929]: I1002 12:44:34.961868 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" event={"ID":"cbd7de5b-4bbd-42a0-b032-6177901cab75","Type":"ContainerDied","Data":"dbdd91ae8e0158aaeab352d333673767a30d5b08904d28e9bc076b14dd10e2f9"} Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.289466 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.331766 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-config-data\") pod \"cbd7de5b-4bbd-42a0-b032-6177901cab75\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.332210 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-combined-ca-bundle\") pod \"cbd7de5b-4bbd-42a0-b032-6177901cab75\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.332333 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5j6f\" (UniqueName: \"kubernetes.io/projected/cbd7de5b-4bbd-42a0-b032-6177901cab75-kube-api-access-b5j6f\") pod \"cbd7de5b-4bbd-42a0-b032-6177901cab75\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.332449 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-scripts\") pod \"cbd7de5b-4bbd-42a0-b032-6177901cab75\" (UID: \"cbd7de5b-4bbd-42a0-b032-6177901cab75\") " Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.339452 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd7de5b-4bbd-42a0-b032-6177901cab75-kube-api-access-b5j6f" (OuterVolumeSpecName: "kube-api-access-b5j6f") pod "cbd7de5b-4bbd-42a0-b032-6177901cab75" (UID: "cbd7de5b-4bbd-42a0-b032-6177901cab75"). InnerVolumeSpecName "kube-api-access-b5j6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.341082 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-scripts" (OuterVolumeSpecName: "scripts") pod "cbd7de5b-4bbd-42a0-b032-6177901cab75" (UID: "cbd7de5b-4bbd-42a0-b032-6177901cab75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.364452 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-config-data" (OuterVolumeSpecName: "config-data") pod "cbd7de5b-4bbd-42a0-b032-6177901cab75" (UID: "cbd7de5b-4bbd-42a0-b032-6177901cab75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.372174 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbd7de5b-4bbd-42a0-b032-6177901cab75" (UID: "cbd7de5b-4bbd-42a0-b032-6177901cab75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.434235 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.434270 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5j6f\" (UniqueName: \"kubernetes.io/projected/cbd7de5b-4bbd-42a0-b032-6177901cab75-kube-api-access-b5j6f\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.434283 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.434292 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd7de5b-4bbd-42a0-b032-6177901cab75-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.983848 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" event={"ID":"cbd7de5b-4bbd-42a0-b032-6177901cab75","Type":"ContainerDied","Data":"34de50094c5c25432bf95b80497d92be651eb5d7cae858cd1031c5e7f25383af"} Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.983889 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34de50094c5c25432bf95b80497d92be651eb5d7cae858cd1031c5e7f25383af" Oct 02 12:44:36 crc kubenswrapper[4929]: I1002 12:44:36.984010 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gpx7r" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.100150 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:44:37 crc kubenswrapper[4929]: E1002 12:44:37.100616 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd7de5b-4bbd-42a0-b032-6177901cab75" containerName="nova-cell0-conductor-db-sync" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.100640 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd7de5b-4bbd-42a0-b032-6177901cab75" containerName="nova-cell0-conductor-db-sync" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.100912 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd7de5b-4bbd-42a0-b032-6177901cab75" containerName="nova-cell0-conductor-db-sync" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.101652 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.103724 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9qsrr" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.103820 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.123884 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.145443 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.145496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.145596 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572fb\" (UniqueName: \"kubernetes.io/projected/1fbd7f42-08ae-4572-96a1-0b74c7a63866-kube-api-access-572fb\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.247324 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.247371 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.247457 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572fb\" (UniqueName: \"kubernetes.io/projected/1fbd7f42-08ae-4572-96a1-0b74c7a63866-kube-api-access-572fb\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.250530 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.256580 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.275266 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572fb\" (UniqueName: \"kubernetes.io/projected/1fbd7f42-08ae-4572-96a1-0b74c7a63866-kube-api-access-572fb\") pod \"nova-cell0-conductor-0\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.427524 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.917108 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:44:37 crc kubenswrapper[4929]: I1002 12:44:37.992182 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1fbd7f42-08ae-4572-96a1-0b74c7a63866","Type":"ContainerStarted","Data":"c60eed098c69f08c70d7bd502267a34fc6203bbf01f6a976c9b377990ea97033"} Oct 02 12:44:39 crc kubenswrapper[4929]: I1002 12:44:39.002798 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1fbd7f42-08ae-4572-96a1-0b74c7a63866","Type":"ContainerStarted","Data":"c9374fb251f110b66c563d08602803d3c0a1fde424b53358990a6636c8d1e2f9"} Oct 02 12:44:39 crc kubenswrapper[4929]: I1002 12:44:39.004284 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:39 crc kubenswrapper[4929]: I1002 12:44:39.026112 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.026090542 podStartE2EDuration="2.026090542s" podCreationTimestamp="2025-10-02 12:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:39.020801489 +0000 UTC m=+5679.571167863" watchObservedRunningTime="2025-10-02 12:44:39.026090542 +0000 UTC m=+5679.576456926" Oct 02 12:44:47 crc kubenswrapper[4929]: I1002 12:44:47.459486 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 12:44:47 crc kubenswrapper[4929]: I1002 12:44:47.991093 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xn629"] Oct 02 12:44:47 crc kubenswrapper[4929]: I1002 12:44:47.993205 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.002309 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.002917 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.004626 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xn629"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.044610 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-config-data\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.044912 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.045033 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-scripts\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.045186 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmnhf\" (UniqueName: \"kubernetes.io/projected/abf080ed-d0fb-4e55-8a95-11964070b99a-kube-api-access-rmnhf\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.143371 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.144907 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.146729 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.146976 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-scripts\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.147104 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.147247 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.147348 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zw7\" (UniqueName: \"kubernetes.io/projected/c5cd9082-d45d-4842-8c60-b75330997f59-kube-api-access-66zw7\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.147450 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmnhf\" (UniqueName: \"kubernetes.io/projected/abf080ed-d0fb-4e55-8a95-11964070b99a-kube-api-access-rmnhf\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.147576 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-config-data\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.153877 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.155445 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.161237 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 
02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.162563 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-config-data\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.186660 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-scripts\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.189061 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmnhf\" (UniqueName: \"kubernetes.io/projected/abf080ed-d0fb-4e55-8a95-11964070b99a-kube-api-access-rmnhf\") pod \"nova-cell0-cell-mapping-xn629\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.232084 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.234335 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.237109 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.251170 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4675fa0b-da83-4c1e-b083-76c5901ef9d7-logs\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.251245 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.251319 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.251345 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zw7\" (UniqueName: \"kubernetes.io/projected/c5cd9082-d45d-4842-8c60-b75330997f59-kube-api-access-66zw7\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.251374 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-config-data\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 
12:44:48.251404 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.251436 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfrhc\" (UniqueName: \"kubernetes.io/projected/4675fa0b-da83-4c1e-b083-76c5901ef9d7-kube-api-access-pfrhc\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.257392 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.259326 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.276932 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.289642 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.291725 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.302761 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66zw7\" (UniqueName: \"kubernetes.io/projected/c5cd9082-d45d-4842-8c60-b75330997f59-kube-api-access-66zw7\") pod \"nova-cell1-novncproxy-0\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.303198 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.323582 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.343153 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.353549 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4675fa0b-da83-4c1e-b083-76c5901ef9d7-logs\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.353646 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-config-data\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.353683 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.353709 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfrhc\" (UniqueName: \"kubernetes.io/projected/4675fa0b-da83-4c1e-b083-76c5901ef9d7-kube-api-access-pfrhc\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.354476 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4675fa0b-da83-4c1e-b083-76c5901ef9d7-logs\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.358075 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.359534 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.363006 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-config-data\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.363533 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.380472 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.389757 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc869bb7-k6d6j"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.391362 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.392633 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfrhc\" (UniqueName: \"kubernetes.io/projected/4675fa0b-da83-4c1e-b083-76c5901ef9d7-kube-api-access-pfrhc\") pod \"nova-metadata-0\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.402112 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.410921 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc869bb7-k6d6j"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.456906 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458317 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2cg9\" (UniqueName: \"kubernetes.io/projected/9a6fd99b-88d0-4818-8fb7-b9ba64097277-kube-api-access-t2cg9\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458468 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6fd99b-88d0-4818-8fb7-b9ba64097277-logs\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458514 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-config-data\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458548 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458608 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458693 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68qn\" (UniqueName: \"kubernetes.io/projected/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-kube-api-access-v68qn\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458711 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458733 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458746 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-dns-svc\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458762 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-config\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.458789 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ncf8\" (UniqueName: \"kubernetes.io/projected/2257267d-e6ea-45bb-b3c4-a30267fc9147-kube-api-access-6ncf8\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.490701 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.547202 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559378 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-config-data\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559428 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559458 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559497 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v68qn\" (UniqueName: \"kubernetes.io/projected/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-kube-api-access-v68qn\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559514 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559534 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559551 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-dns-svc\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559571 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-config\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559592 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ncf8\" (UniqueName: \"kubernetes.io/projected/2257267d-e6ea-45bb-b3c4-a30267fc9147-kube-api-access-6ncf8\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559618 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559636 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2cg9\" (UniqueName: \"kubernetes.io/projected/9a6fd99b-88d0-4818-8fb7-b9ba64097277-kube-api-access-t2cg9\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.559675 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6fd99b-88d0-4818-8fb7-b9ba64097277-logs\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.560092 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6fd99b-88d0-4818-8fb7-b9ba64097277-logs\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.561948 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-dns-svc\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.562220 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.563176 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.564473 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-config\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.566359 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.567421 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-config-data\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.575362 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.578563 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.588234 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68qn\" (UniqueName: \"kubernetes.io/projected/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-kube-api-access-v68qn\") pod \"nova-scheduler-0\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.590501 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ncf8\" (UniqueName: \"kubernetes.io/projected/2257267d-e6ea-45bb-b3c4-a30267fc9147-kube-api-access-6ncf8\") pod \"dnsmasq-dns-5cc869bb7-k6d6j\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") " pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.596832 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2cg9\" (UniqueName: \"kubernetes.io/projected/9a6fd99b-88d0-4818-8fb7-b9ba64097277-kube-api-access-t2cg9\") pod \"nova-api-0\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.723019 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.746480 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.748449 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.851387 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xn629"] Oct 02 12:44:48 crc kubenswrapper[4929]: I1002 12:44:48.997620 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.094184 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.095937 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xn629" event={"ID":"abf080ed-d0fb-4e55-8a95-11964070b99a","Type":"ContainerStarted","Data":"9314bd4d7241c1c7b5af9b85720a3b99828b095da024bfb64f07932e80fcecdd"} Oct 02 12:44:49 crc kubenswrapper[4929]: W1002 12:44:49.097021 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cd9082_d45d_4842_8c60_b75330997f59.slice/crio-6e215cb18e45ccdd5b5dd555e72aad56b9a0a38eab3d5de3eac83b50ad85893f WatchSource:0}: Error finding container 6e215cb18e45ccdd5b5dd555e72aad56b9a0a38eab3d5de3eac83b50ad85893f: Status 404 returned error can't find the container with id 6e215cb18e45ccdd5b5dd555e72aad56b9a0a38eab3d5de3eac83b50ad85893f Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.098021 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4675fa0b-da83-4c1e-b083-76c5901ef9d7","Type":"ContainerStarted","Data":"a0f7a3c2bdef840969f50dc4c22f050e71b195dc45dcdf75c9ffe55caa02a8ec"} Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.309908 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.315202 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:49 crc kubenswrapper[4929]: W1002 12:44:49.317485 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2baa029c_ae32_4ba7_adc5_0db1f3dd13ff.slice/crio-4569e097674b5d728ab4cf472881a62d96b461eb4b135c1f7d60c61d9487cdd4 WatchSource:0}: Error finding container 4569e097674b5d728ab4cf472881a62d96b461eb4b135c1f7d60c61d9487cdd4: Status 404 returned error can't find the container with id 4569e097674b5d728ab4cf472881a62d96b461eb4b135c1f7d60c61d9487cdd4 Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.411867 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc869bb7-k6d6j"] Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.430420 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tp2vw"] Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.434624 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.436937 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.437393 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.446394 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tp2vw"] Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.482789 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.482846 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5q7n\" (UniqueName: \"kubernetes.io/projected/3c4613a7-f848-47e6-9ada-737b3de390d9-kube-api-access-q5q7n\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.482892 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-config-data\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.482943 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-scripts\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.585161 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.585253 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5q7n\" (UniqueName: \"kubernetes.io/projected/3c4613a7-f848-47e6-9ada-737b3de390d9-kube-api-access-q5q7n\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.585308 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-config-data\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.585418 4929 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-scripts\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.588524 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.589879 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-scripts\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.615074 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-config-data\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.617317 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5q7n\" (UniqueName: \"kubernetes.io/projected/3c4613a7-f848-47e6-9ada-737b3de390d9-kube-api-access-q5q7n\") pod \"nova-cell1-conductor-db-sync-tp2vw\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:49 crc kubenswrapper[4929]: I1002 12:44:49.773555 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.109584 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff","Type":"ContainerStarted","Data":"35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.109647 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff","Type":"ContainerStarted","Data":"4569e097674b5d728ab4cf472881a62d96b461eb4b135c1f7d60c61d9487cdd4"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.111662 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4675fa0b-da83-4c1e-b083-76c5901ef9d7","Type":"ContainerStarted","Data":"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.111762 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4675fa0b-da83-4c1e-b083-76c5901ef9d7","Type":"ContainerStarted","Data":"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.113066 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c5cd9082-d45d-4842-8c60-b75330997f59","Type":"ContainerStarted","Data":"964670ac51fc2bc9a5c92e8b605ecb92c96615496d64726fcd06a56bc90de38c"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.113110 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c5cd9082-d45d-4842-8c60-b75330997f59","Type":"ContainerStarted","Data":"6e215cb18e45ccdd5b5dd555e72aad56b9a0a38eab3d5de3eac83b50ad85893f"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.114594 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xn629" event={"ID":"abf080ed-d0fb-4e55-8a95-11964070b99a","Type":"ContainerStarted","Data":"c8e3678828060ab54ec90381b477ddf8e2c2d851b350911e16caad3cc7f2449a"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.116770 4929 generic.go:334] "Generic (PLEG): container finished" podID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerID="00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752" exitCode=0 Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.116840 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" event={"ID":"2257267d-e6ea-45bb-b3c4-a30267fc9147","Type":"ContainerDied","Data":"00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.116860 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" event={"ID":"2257267d-e6ea-45bb-b3c4-a30267fc9147","Type":"ContainerStarted","Data":"c9fd5ad1d79f3a9e29f5346b4143f50aedef6ba82a40de408527753345b931de"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.119298 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a6fd99b-88d0-4818-8fb7-b9ba64097277","Type":"ContainerStarted","Data":"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.119326 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9a6fd99b-88d0-4818-8fb7-b9ba64097277","Type":"ContainerStarted","Data":"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.119336 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a6fd99b-88d0-4818-8fb7-b9ba64097277","Type":"ContainerStarted","Data":"9d6a549318322e5c815c066eb4bc8074321e71e7e60dd972a7594fdcdbf1a0bb"} Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.130303 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.130286179 podStartE2EDuration="2.130286179s" podCreationTimestamp="2025-10-02 12:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:50.125791219 +0000 UTC m=+5690.676157593" watchObservedRunningTime="2025-10-02 12:44:50.130286179 +0000 UTC m=+5690.680652543" Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.166976 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.166931326 podStartE2EDuration="2.166931326s" podCreationTimestamp="2025-10-02 12:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:50.143010806 +0000 UTC m=+5690.693377170" watchObservedRunningTime="2025-10-02 12:44:50.166931326 +0000 UTC m=+5690.717297690" Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.217652 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.217560226 podStartE2EDuration="2.217560226s" podCreationTimestamp="2025-10-02 12:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:50.186169411 +0000 UTC m=+5690.736535775" watchObservedRunningTime="2025-10-02 12:44:50.217560226 +0000 UTC m=+5690.767926590" Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.240741 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.240718934 podStartE2EDuration="2.240718934s" podCreationTimestamp="2025-10-02 12:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:50.212340086 +0000 UTC m=+5690.762706450" watchObservedRunningTime="2025-10-02 12:44:50.240718934 +0000 UTC m=+5690.791085298" Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.269461 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xn629" podStartSLOduration=3.269412772 podStartE2EDuration="3.269412772s" podCreationTimestamp="2025-10-02 12:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:50.251095573 +0000 UTC m=+5690.801461927" watchObservedRunningTime="2025-10-02 12:44:50.269412772 +0000 UTC m=+5690.819779136" Oct 02 12:44:50 crc kubenswrapper[4929]: I1002 12:44:50.290136 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tp2vw"] Oct 02 12:44:51 crc kubenswrapper[4929]: I1002 12:44:51.131712 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" event={"ID":"2257267d-e6ea-45bb-b3c4-a30267fc9147","Type":"ContainerStarted","Data":"c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5"} Oct 02 12:44:51 crc kubenswrapper[4929]: I1002 12:44:51.131883 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:51 crc kubenswrapper[4929]: I1002 12:44:51.135913 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" event={"ID":"3c4613a7-f848-47e6-9ada-737b3de390d9","Type":"ContainerStarted","Data":"33b43bf7f91b8e5ef8dd5aed1fad0674b63494091738c29da35c12a0d0620d1a"} Oct 02 12:44:51 crc kubenswrapper[4929]: I1002 12:44:51.135974 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" event={"ID":"3c4613a7-f848-47e6-9ada-737b3de390d9","Type":"ContainerStarted","Data":"61c706fccab277a26a7bf8bd0ff78d5da93e74693b2b918f2e7a01e562b62a20"} Oct 02 12:44:51 crc kubenswrapper[4929]: I1002 12:44:51.208236 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" podStartSLOduration=3.208211568 podStartE2EDuration="3.208211568s" podCreationTimestamp="2025-10-02 12:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:51.184290398 +0000 UTC m=+5691.734656762" watchObservedRunningTime="2025-10-02 12:44:51.208211568 +0000 UTC m=+5691.758577932" Oct 02 12:44:51 crc kubenswrapper[4929]: I1002 12:44:51.213780 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" podStartSLOduration=2.213757418 podStartE2EDuration="2.213757418s" podCreationTimestamp="2025-10-02 12:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:51.203825402 +0000 UTC m=+5691.754191756" watchObservedRunningTime="2025-10-02 12:44:51.213757418 +0000 UTC m=+5691.764123792" Oct 02 12:44:53 crc kubenswrapper[4929]: I1002 12:44:53.403055 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:44:53 crc kubenswrapper[4929]: I1002 12:44:53.403649 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:44:53 crc kubenswrapper[4929]: I1002 12:44:53.547411 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:53 crc kubenswrapper[4929]: I1002 12:44:53.747337 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 12:44:54 crc kubenswrapper[4929]: I1002 12:44:54.168182 4929 generic.go:334] "Generic (PLEG): container finished" podID="3c4613a7-f848-47e6-9ada-737b3de390d9" containerID="33b43bf7f91b8e5ef8dd5aed1fad0674b63494091738c29da35c12a0d0620d1a" exitCode=0 Oct 02 12:44:54 crc kubenswrapper[4929]: I1002 12:44:54.168323 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" event={"ID":"3c4613a7-f848-47e6-9ada-737b3de390d9","Type":"ContainerDied","Data":"33b43bf7f91b8e5ef8dd5aed1fad0674b63494091738c29da35c12a0d0620d1a"} Oct 02 12:44:54 crc kubenswrapper[4929]: I1002 12:44:54.170392 4929 generic.go:334] "Generic (PLEG): container finished" 
podID="abf080ed-d0fb-4e55-8a95-11964070b99a" containerID="c8e3678828060ab54ec90381b477ddf8e2c2d851b350911e16caad3cc7f2449a" exitCode=0 Oct 02 12:44:54 crc kubenswrapper[4929]: I1002 12:44:54.170419 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xn629" event={"ID":"abf080ed-d0fb-4e55-8a95-11964070b99a","Type":"ContainerDied","Data":"c8e3678828060ab54ec90381b477ddf8e2c2d851b350911e16caad3cc7f2449a"} Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.755343 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.762912 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.920680 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-config-data\") pod \"3c4613a7-f848-47e6-9ada-737b3de390d9\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.920735 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-combined-ca-bundle\") pod \"3c4613a7-f848-47e6-9ada-737b3de390d9\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.920796 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5q7n\" (UniqueName: \"kubernetes.io/projected/3c4613a7-f848-47e6-9ada-737b3de390d9-kube-api-access-q5q7n\") pod \"3c4613a7-f848-47e6-9ada-737b3de390d9\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.920819 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-scripts\") pod \"abf080ed-d0fb-4e55-8a95-11964070b99a\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.920871 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-scripts\") pod \"3c4613a7-f848-47e6-9ada-737b3de390d9\" (UID: \"3c4613a7-f848-47e6-9ada-737b3de390d9\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.920944 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-combined-ca-bundle\") pod \"abf080ed-d0fb-4e55-8a95-11964070b99a\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.920978 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-config-data\") pod \"abf080ed-d0fb-4e55-8a95-11964070b99a\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.921090 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmnhf\" (UniqueName: 
\"kubernetes.io/projected/abf080ed-d0fb-4e55-8a95-11964070b99a-kube-api-access-rmnhf\") pod \"abf080ed-d0fb-4e55-8a95-11964070b99a\" (UID: \"abf080ed-d0fb-4e55-8a95-11964070b99a\") " Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.926647 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-scripts" (OuterVolumeSpecName: "scripts") pod "3c4613a7-f848-47e6-9ada-737b3de390d9" (UID: "3c4613a7-f848-47e6-9ada-737b3de390d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.926692 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c4613a7-f848-47e6-9ada-737b3de390d9-kube-api-access-q5q7n" (OuterVolumeSpecName: "kube-api-access-q5q7n") pod "3c4613a7-f848-47e6-9ada-737b3de390d9" (UID: "3c4613a7-f848-47e6-9ada-737b3de390d9"). InnerVolumeSpecName "kube-api-access-q5q7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.940491 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf080ed-d0fb-4e55-8a95-11964070b99a-kube-api-access-rmnhf" (OuterVolumeSpecName: "kube-api-access-rmnhf") pod "abf080ed-d0fb-4e55-8a95-11964070b99a" (UID: "abf080ed-d0fb-4e55-8a95-11964070b99a"). InnerVolumeSpecName "kube-api-access-rmnhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.940635 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-scripts" (OuterVolumeSpecName: "scripts") pod "abf080ed-d0fb-4e55-8a95-11964070b99a" (UID: "abf080ed-d0fb-4e55-8a95-11964070b99a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.947611 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c4613a7-f848-47e6-9ada-737b3de390d9" (UID: "3c4613a7-f848-47e6-9ada-737b3de390d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.948370 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-config-data" (OuterVolumeSpecName: "config-data") pod "3c4613a7-f848-47e6-9ada-737b3de390d9" (UID: "3c4613a7-f848-47e6-9ada-737b3de390d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.949895 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-config-data" (OuterVolumeSpecName: "config-data") pod "abf080ed-d0fb-4e55-8a95-11964070b99a" (UID: "abf080ed-d0fb-4e55-8a95-11964070b99a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:55 crc kubenswrapper[4929]: I1002 12:44:55.952244 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abf080ed-d0fb-4e55-8a95-11964070b99a" (UID: "abf080ed-d0fb-4e55-8a95-11964070b99a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023406 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023442 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023460 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023475 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmnhf\" (UniqueName: \"kubernetes.io/projected/abf080ed-d0fb-4e55-8a95-11964070b99a-kube-api-access-rmnhf\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023485 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023496 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c4613a7-f848-47e6-9ada-737b3de390d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023506 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5q7n\" (UniqueName: \"kubernetes.io/projected/3c4613a7-f848-47e6-9ada-737b3de390d9-kube-api-access-q5q7n\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.023518 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abf080ed-d0fb-4e55-8a95-11964070b99a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.190994 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xn629" event={"ID":"abf080ed-d0fb-4e55-8a95-11964070b99a","Type":"ContainerDied","Data":"9314bd4d7241c1c7b5af9b85720a3b99828b095da024bfb64f07932e80fcecdd"} Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.191037 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9314bd4d7241c1c7b5af9b85720a3b99828b095da024bfb64f07932e80fcecdd" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.191073 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xn629" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.193713 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" event={"ID":"3c4613a7-f848-47e6-9ada-737b3de390d9","Type":"ContainerDied","Data":"61c706fccab277a26a7bf8bd0ff78d5da93e74693b2b918f2e7a01e562b62a20"} Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.193747 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61c706fccab277a26a7bf8bd0ff78d5da93e74693b2b918f2e7a01e562b62a20" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.193787 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tp2vw" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.291234 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:44:56 crc kubenswrapper[4929]: E1002 12:44:56.291609 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf080ed-d0fb-4e55-8a95-11964070b99a" containerName="nova-manage" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.291625 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf080ed-d0fb-4e55-8a95-11964070b99a" containerName="nova-manage" Oct 02 12:44:56 crc kubenswrapper[4929]: E1002 12:44:56.291667 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c4613a7-f848-47e6-9ada-737b3de390d9" containerName="nova-cell1-conductor-db-sync" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.291674 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c4613a7-f848-47e6-9ada-737b3de390d9" containerName="nova-cell1-conductor-db-sync" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.291822 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf080ed-d0fb-4e55-8a95-11964070b99a" containerName="nova-manage" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.291840 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c4613a7-f848-47e6-9ada-737b3de390d9" containerName="nova-cell1-conductor-db-sync" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.292509 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.295003 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.300370 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.387130 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.387373 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-log" containerID="cri-o://3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7" gracePeriod=30 Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.387495 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-api" containerID="cri-o://d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81" gracePeriod=30 Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.427754 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.428015 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" containerName="nova-scheduler-scheduler" containerID="cri-o://35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a" gracePeriod=30 Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.430870 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.430933 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.431030 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmv8\" (UniqueName: \"kubernetes.io/projected/e94be396-9a29-47d2-9f2a-ff9ab7a08605-kube-api-access-2qmv8\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.439255 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.439535 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-log" containerID="cri-o://0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3" gracePeriod=30 Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.439610 4929 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-metadata-0" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-metadata" containerID="cri-o://08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018" gracePeriod=30 Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.533397 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.533455 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.533503 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmv8\" (UniqueName: \"kubernetes.io/projected/e94be396-9a29-47d2-9f2a-ff9ab7a08605-kube-api-access-2qmv8\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.539033 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.539833 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.552738 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmv8\" (UniqueName: \"kubernetes.io/projected/e94be396-9a29-47d2-9f2a-ff9ab7a08605-kube-api-access-2qmv8\") pod \"nova-cell1-conductor-0\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:56 crc kubenswrapper[4929]: I1002 12:44:56.608194 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.143767 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.152324 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.214542 4929 generic.go:334] "Generic (PLEG): container finished" podID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerID="08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018" exitCode=0 Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.214590 4929 generic.go:334] "Generic (PLEG): container finished" podID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerID="0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3" exitCode=143 Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.214647 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.214724 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4675fa0b-da83-4c1e-b083-76c5901ef9d7","Type":"ContainerDied","Data":"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018"} Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.214768 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4675fa0b-da83-4c1e-b083-76c5901ef9d7","Type":"ContainerDied","Data":"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3"} Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.214779 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4675fa0b-da83-4c1e-b083-76c5901ef9d7","Type":"ContainerDied","Data":"a0f7a3c2bdef840969f50dc4c22f050e71b195dc45dcdf75c9ffe55caa02a8ec"} Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.214800 4929 scope.go:117] "RemoveContainer" containerID="08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.220101 4929 generic.go:334] "Generic (PLEG): container finished" podID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerID="d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81" exitCode=0 Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.220129 4929 generic.go:334] "Generic (PLEG): container finished" podID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerID="3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7" exitCode=143 Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.220183 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a6fd99b-88d0-4818-8fb7-b9ba64097277","Type":"ContainerDied","Data":"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81"} Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.220218 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a6fd99b-88d0-4818-8fb7-b9ba64097277","Type":"ContainerDied","Data":"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7"} Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.220232 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a6fd99b-88d0-4818-8fb7-b9ba64097277","Type":"ContainerDied","Data":"9d6a549318322e5c815c066eb4bc8074321e71e7e60dd972a7594fdcdbf1a0bb"} Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.220311 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.245697 4929 scope.go:117] "RemoveContainer" containerID="0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.253809 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258182 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2cg9\" (UniqueName: \"kubernetes.io/projected/9a6fd99b-88d0-4818-8fb7-b9ba64097277-kube-api-access-t2cg9\") pod \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258264 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6fd99b-88d0-4818-8fb7-b9ba64097277-logs\") pod \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258361 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfrhc\" (UniqueName: \"kubernetes.io/projected/4675fa0b-da83-4c1e-b083-76c5901ef9d7-kube-api-access-pfrhc\") pod \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258406 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-config-data\") pod \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258473 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4675fa0b-da83-4c1e-b083-76c5901ef9d7-logs\") pod \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258560 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-combined-ca-bundle\") pod \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\" (UID: \"9a6fd99b-88d0-4818-8fb7-b9ba64097277\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258683 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-config-data\") pod \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.258722 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-combined-ca-bundle\") pod \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\" (UID: \"4675fa0b-da83-4c1e-b083-76c5901ef9d7\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.262773 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4675fa0b-da83-4c1e-b083-76c5901ef9d7-logs" (OuterVolumeSpecName: "logs") pod "4675fa0b-da83-4c1e-b083-76c5901ef9d7" (UID: "4675fa0b-da83-4c1e-b083-76c5901ef9d7"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.262990 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6fd99b-88d0-4818-8fb7-b9ba64097277-logs" (OuterVolumeSpecName: "logs") pod "9a6fd99b-88d0-4818-8fb7-b9ba64097277" (UID: "9a6fd99b-88d0-4818-8fb7-b9ba64097277"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.266351 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6fd99b-88d0-4818-8fb7-b9ba64097277-kube-api-access-t2cg9" (OuterVolumeSpecName: "kube-api-access-t2cg9") pod "9a6fd99b-88d0-4818-8fb7-b9ba64097277" (UID: "9a6fd99b-88d0-4818-8fb7-b9ba64097277"). InnerVolumeSpecName "kube-api-access-t2cg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.276487 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4675fa0b-da83-4c1e-b083-76c5901ef9d7-kube-api-access-pfrhc" (OuterVolumeSpecName: "kube-api-access-pfrhc") pod "4675fa0b-da83-4c1e-b083-76c5901ef9d7" (UID: "4675fa0b-da83-4c1e-b083-76c5901ef9d7"). InnerVolumeSpecName "kube-api-access-pfrhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.287996 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-config-data" (OuterVolumeSpecName: "config-data") pod "9a6fd99b-88d0-4818-8fb7-b9ba64097277" (UID: "9a6fd99b-88d0-4818-8fb7-b9ba64097277"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.289452 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a6fd99b-88d0-4818-8fb7-b9ba64097277" (UID: "9a6fd99b-88d0-4818-8fb7-b9ba64097277"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.290113 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4675fa0b-da83-4c1e-b083-76c5901ef9d7" (UID: "4675fa0b-da83-4c1e-b083-76c5901ef9d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.296773 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-config-data" (OuterVolumeSpecName: "config-data") pod "4675fa0b-da83-4c1e-b083-76c5901ef9d7" (UID: "4675fa0b-da83-4c1e-b083-76c5901ef9d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.325440 4929 scope.go:117] "RemoveContainer" containerID="08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018" Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.326112 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018\": container with ID starting with 08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018 not found: ID does not exist" containerID="08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.326277 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018"} err="failed to get container status \"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018\": rpc error: code = NotFound desc = could not find container \"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018\": container with ID starting with 08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018 not found: ID does not exist" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.326336 4929 scope.go:117] "RemoveContainer" containerID="0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3" Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.326902 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3\": container with ID starting with 0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3 not found: ID does not exist" containerID="0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.326932 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3"} err="failed to get container status \"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3\": rpc error: code = NotFound desc = could not find container \"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3\": container with ID starting with 0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3 not found: ID does not exist" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.326972 4929 scope.go:117] "RemoveContainer" containerID="08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.327316 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018"} err="failed to get container status \"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018\": rpc error: code = NotFound desc = could not find container \"08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018\": container with ID starting with 08f26d8599f1a2c2e22da63941b98c7e2ba3bcced06d1a080fda411aee02e018 not found: ID does not exist" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.327344 4929 scope.go:117] "RemoveContainer" containerID="0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.327668 4929 
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.327668 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3"} err="failed to get container status \"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3\": rpc error: code = NotFound desc = could not find container \"0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3\": container with ID starting with 0275d1dc04d972773b511dc558c7536f7b609c912e168eeca44a115c2529dab3 not found: ID does not exist"
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.327692 4929 scope.go:117] "RemoveContainer" containerID="d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81"
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362140 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362193 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4675fa0b-da83-4c1e-b083-76c5901ef9d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362217 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2cg9\" (UniqueName: \"kubernetes.io/projected/9a6fd99b-88d0-4818-8fb7-b9ba64097277-kube-api-access-t2cg9\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362234 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a6fd99b-88d0-4818-8fb7-b9ba64097277-logs\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362251 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfrhc\" (UniqueName: \"kubernetes.io/projected/4675fa0b-da83-4c1e-b083-76c5901ef9d7-kube-api-access-pfrhc\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362263 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362276 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4675fa0b-da83-4c1e-b083-76c5901ef9d7-logs\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.362287 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a6fd99b-88d0-4818-8fb7-b9ba64097277-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.412061 4929 scope.go:117] "RemoveContainer" containerID="3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7"
Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.439506 4929 scope.go:117] "RemoveContainer" containerID="d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81"
Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.440331 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81\": container with ID starting with d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81 not found:
ID does not exist" containerID="d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.440367 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81"} err="failed to get container status \"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81\": rpc error: code = NotFound desc = could not find container \"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81\": container with ID starting with d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81 not found: ID does not exist" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.440393 4929 scope.go:117] "RemoveContainer" containerID="3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7" Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.440707 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7\": container with ID starting with 3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7 not found: ID does not exist" containerID="3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.440735 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7"} err="failed to get container status \"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7\": rpc error: code = NotFound desc = could not find container \"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7\": container with ID starting with 3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7 not found: ID does not exist" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.440751 4929 scope.go:117] "RemoveContainer" containerID="d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.440994 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81"} err="failed to get container status \"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81\": rpc error: code = NotFound desc = could not find container \"d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81\": container with ID starting with d7a8079fc43ccf3c9e3da76360dbc17612645f904bc7dcf71c1144a9b8a63c81 not found: ID does not exist" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.441013 4929 scope.go:117] "RemoveContainer" containerID="3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.441258 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7"} err="failed to get container status \"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7\": rpc error: code = NotFound desc = could not find container \"3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7\": container with ID starting with 3255fc7c7c6b1194d1d6e0e08beb7be9aaa03dc1e8f3f68c4fdadf1d8a3ab1e7 not found: ID does not exist" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.559210 4929 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.568913 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.584031 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.590878 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.591335 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-log" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591349 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-log" Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.591367 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-metadata" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591373 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-metadata" Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.591389 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-log" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591397 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-log" Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.591425 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-api" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591431 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-api" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591593 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-api" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591611 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-metadata" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591620 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" containerName="nova-api-log" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.591629 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" containerName="nova-metadata-log" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.592633 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.599164 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.601524 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.624849 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.635017 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.636611 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.639881 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.654911 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.734800 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.771804 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nggrw\" (UniqueName: \"kubernetes.io/projected/14723f34-bd42-4857-836e-35470e2645a0-kube-api-access-nggrw\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.773580 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpmnx\" (UniqueName: \"kubernetes.io/projected/5596f69b-3e4d-49f7-b85b-7508b7605c46-kube-api-access-gpmnx\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.773630 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14723f34-bd42-4857-836e-35470e2645a0-logs\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.773670 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596f69b-3e4d-49f7-b85b-7508b7605c46-logs\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.773742 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-config-data\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.773791 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " 
pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.773816 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.773905 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-config-data\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.875028 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68qn\" (UniqueName: \"kubernetes.io/projected/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-kube-api-access-v68qn\") pod \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.875671 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-combined-ca-bundle\") pod \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.875784 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data\") pod \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876285 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-config-data\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876333 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876368 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876428 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-config-data\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876447 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nggrw\" (UniqueName: \"kubernetes.io/projected/14723f34-bd42-4857-836e-35470e2645a0-kube-api-access-nggrw\") pod 
\"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876505 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpmnx\" (UniqueName: \"kubernetes.io/projected/5596f69b-3e4d-49f7-b85b-7508b7605c46-kube-api-access-gpmnx\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876521 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14723f34-bd42-4857-836e-35470e2645a0-logs\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.876547 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596f69b-3e4d-49f7-b85b-7508b7605c46-logs\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.877033 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596f69b-3e4d-49f7-b85b-7508b7605c46-logs\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.878836 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14723f34-bd42-4857-836e-35470e2645a0-logs\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.881740 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.891873 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.893114 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-config-data\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.894304 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-kube-api-access-v68qn" (OuterVolumeSpecName: "kube-api-access-v68qn") pod "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" (UID: "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff"). InnerVolumeSpecName "kube-api-access-v68qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.896298 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-config-data\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.896843 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nggrw\" (UniqueName: \"kubernetes.io/projected/14723f34-bd42-4857-836e-35470e2645a0-kube-api-access-nggrw\") pod \"nova-metadata-0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " pod="openstack/nova-metadata-0" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.897275 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpmnx\" (UniqueName: \"kubernetes.io/projected/5596f69b-3e4d-49f7-b85b-7508b7605c46-kube-api-access-gpmnx\") pod \"nova-api-0\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " pod="openstack/nova-api-0" Oct 02 12:44:57 crc kubenswrapper[4929]: E1002 12:44:57.904284 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data podName:2baa029c-ae32-4ba7-adc5-0db1f3dd13ff nodeName:}" failed. No retries permitted until 2025-10-02 12:44:58.404259939 +0000 UTC m=+5698.954626293 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data") pod "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" (UID: "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff") : error deleting /var/lib/kubelet/pods/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff/volume-subpaths: remove /var/lib/kubelet/pods/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff/volume-subpaths: no such file or directory Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.907149 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" (UID: "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.977998 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v68qn\" (UniqueName: \"kubernetes.io/projected/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-kube-api-access-v68qn\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:57 crc kubenswrapper[4929]: I1002 12:44:57.978365 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.022259 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.032005 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.181020 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4675fa0b-da83-4c1e-b083-76c5901ef9d7" path="/var/lib/kubelet/pods/4675fa0b-da83-4c1e-b083-76c5901ef9d7/volumes" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.182351 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6fd99b-88d0-4818-8fb7-b9ba64097277" path="/var/lib/kubelet/pods/9a6fd99b-88d0-4818-8fb7-b9ba64097277/volumes" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.233576 4929 generic.go:334] "Generic (PLEG): container finished" podID="2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" containerID="35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a" exitCode=0 Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.233667 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.233694 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff","Type":"ContainerDied","Data":"35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a"} Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.234161 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff","Type":"ContainerDied","Data":"4569e097674b5d728ab4cf472881a62d96b461eb4b135c1f7d60c61d9487cdd4"} Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.234187 4929 scope.go:117] "RemoveContainer" containerID="35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.241585 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e94be396-9a29-47d2-9f2a-ff9ab7a08605","Type":"ContainerStarted","Data":"703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa"} Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.241623 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e94be396-9a29-47d2-9f2a-ff9ab7a08605","Type":"ContainerStarted","Data":"c98d29ebfdccc0104c3a71ae10ebb29dcd56f296660b60d1be54e37bb34ee245"} Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.241717 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.265416 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.265394235 podStartE2EDuration="2.265394235s" podCreationTimestamp="2025-10-02 12:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:44:58.257330293 +0000 UTC m=+5698.807696667" watchObservedRunningTime="2025-10-02 12:44:58.265394235 +0000 UTC m=+5698.815760609" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.270980 4929 scope.go:117] "RemoveContainer" containerID="35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a" Oct 02 12:44:58 crc kubenswrapper[4929]: E1002 12:44:58.271406 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a\": container with ID starting with 35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a not found: ID does not exist" containerID="35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.271439 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a"} err="failed to get container status \"35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a\": rpc error: code = NotFound desc = could not find container \"35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a\": container with ID starting with 35914b2c60ab88c1ca225816fb64d320329b4d4aa366e8d5468c766412ca4f1a not found: ID does not exist" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.488424 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data\") pod \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\" (UID: \"2baa029c-ae32-4ba7-adc5-0db1f3dd13ff\") " Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.494371 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data" (OuterVolumeSpecName: "config-data") pod "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" (UID: "2baa029c-ae32-4ba7-adc5-0db1f3dd13ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.548020 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.550031 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.580536 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.593373 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.594590 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.599026 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.615795 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.639430 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:58 crc kubenswrapper[4929]: E1002 12:44:58.640828 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" containerName="nova-scheduler-scheduler" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.640851 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" containerName="nova-scheduler-scheduler" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.641131 4929 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" containerName="nova-scheduler-scheduler" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.642863 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.653388 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.659444 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.696835 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdzxm\" (UniqueName: \"kubernetes.io/projected/c66f23ec-bd44-4452-9a43-cb68dea4145d-kube-api-access-qdzxm\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.696910 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.697047 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-config-data\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.750339 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.817858 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdzxm\" (UniqueName: \"kubernetes.io/projected/c66f23ec-bd44-4452-9a43-cb68dea4145d-kube-api-access-qdzxm\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.818419 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.818483 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-config-data\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.823718 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.828152 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-config-data\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.834940 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56b4f4cfbf-b59xq"] Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.840255 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" podUID="79f52949-db23-4922-adf0-eb1122fe74a5" containerName="dnsmasq-dns" containerID="cri-o://b40a250f80ae363a7b44880a9f84e07de3cd45804238dc2868f5a718ec1da41f" gracePeriod=10 Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.879058 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdzxm\" (UniqueName: \"kubernetes.io/projected/c66f23ec-bd44-4452-9a43-cb68dea4145d-kube-api-access-qdzxm\") pod \"nova-scheduler-0\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " pod="openstack/nova-scheduler-0" Oct 02 12:44:58 crc kubenswrapper[4929]: I1002 12:44:58.974313 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.264300 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14723f34-bd42-4857-836e-35470e2645a0","Type":"ContainerStarted","Data":"e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8"} Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.264634 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14723f34-bd42-4857-836e-35470e2645a0","Type":"ContainerStarted","Data":"13d179c21b3dd57a0fbd8b13f0f33df0aa58d4e463cf6417ca4fb0d03fe5f622"} Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.267864 4929 generic.go:334] "Generic (PLEG): container finished" podID="79f52949-db23-4922-adf0-eb1122fe74a5" containerID="b40a250f80ae363a7b44880a9f84e07de3cd45804238dc2868f5a718ec1da41f" exitCode=0 Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.267997 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" event={"ID":"79f52949-db23-4922-adf0-eb1122fe74a5","Type":"ContainerDied","Data":"b40a250f80ae363a7b44880a9f84e07de3cd45804238dc2868f5a718ec1da41f"} Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.271350 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596f69b-3e4d-49f7-b85b-7508b7605c46","Type":"ContainerStarted","Data":"3ef8b05ab1e0ed5a2de4d085bb046eee751e1981d1e92dd507f2581c49d91524"} Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.287882 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.639833 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:44:59 crc kubenswrapper[4929]: W1002 12:44:59.651308 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc66f23ec_bd44_4452_9a43_cb68dea4145d.slice/crio-948bd8f168a0d08ab29220942f11b9dd4b2729a6f0a6288394b70d9223117f8a WatchSource:0}: Error finding container 948bd8f168a0d08ab29220942f11b9dd4b2729a6f0a6288394b70d9223117f8a: Status 404 returned error can't find the container with id 
948bd8f168a0d08ab29220942f11b9dd4b2729a6f0a6288394b70d9223117f8a Oct 02 12:44:59 crc kubenswrapper[4929]: I1002 12:44:59.934305 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.057666 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-dns-svc\") pod \"79f52949-db23-4922-adf0-eb1122fe74a5\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.057760 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-nb\") pod \"79f52949-db23-4922-adf0-eb1122fe74a5\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.057870 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-sb\") pod \"79f52949-db23-4922-adf0-eb1122fe74a5\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.057932 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d986\" (UniqueName: \"kubernetes.io/projected/79f52949-db23-4922-adf0-eb1122fe74a5-kube-api-access-5d986\") pod \"79f52949-db23-4922-adf0-eb1122fe74a5\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.058024 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-config\") pod \"79f52949-db23-4922-adf0-eb1122fe74a5\" (UID: \"79f52949-db23-4922-adf0-eb1122fe74a5\") " Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.068208 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f52949-db23-4922-adf0-eb1122fe74a5-kube-api-access-5d986" (OuterVolumeSpecName: "kube-api-access-5d986") pod "79f52949-db23-4922-adf0-eb1122fe74a5" (UID: "79f52949-db23-4922-adf0-eb1122fe74a5"). InnerVolumeSpecName "kube-api-access-5d986". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.136805 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79f52949-db23-4922-adf0-eb1122fe74a5" (UID: "79f52949-db23-4922-adf0-eb1122fe74a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.138524 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79f52949-db23-4922-adf0-eb1122fe74a5" (UID: "79f52949-db23-4922-adf0-eb1122fe74a5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.148211 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79f52949-db23-4922-adf0-eb1122fe74a5" (UID: "79f52949-db23-4922-adf0-eb1122fe74a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.155157 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm"] Oct 02 12:45:00 crc kubenswrapper[4929]: E1002 12:45:00.155980 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f52949-db23-4922-adf0-eb1122fe74a5" containerName="dnsmasq-dns" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.156000 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f52949-db23-4922-adf0-eb1122fe74a5" containerName="dnsmasq-dns" Oct 02 12:45:00 crc kubenswrapper[4929]: E1002 12:45:00.156037 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f52949-db23-4922-adf0-eb1122fe74a5" containerName="init" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.156065 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f52949-db23-4922-adf0-eb1122fe74a5" containerName="init" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.156402 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f52949-db23-4922-adf0-eb1122fe74a5" containerName="dnsmasq-dns" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.156538 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-config" (OuterVolumeSpecName: "config") pod "79f52949-db23-4922-adf0-eb1122fe74a5" (UID: "79f52949-db23-4922-adf0-eb1122fe74a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.158100 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.160862 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.160894 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.160908 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.160937 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79f52949-db23-4922-adf0-eb1122fe74a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.160950 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d986\" (UniqueName: \"kubernetes.io/projected/79f52949-db23-4922-adf0-eb1122fe74a5-kube-api-access-5d986\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.186162 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.186410 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.186668 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2baa029c-ae32-4ba7-adc5-0db1f3dd13ff" path="/var/lib/kubelet/pods/2baa029c-ae32-4ba7-adc5-0db1f3dd13ff/volumes" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.188216 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm"] Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.263607 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dnjz\" (UniqueName: \"kubernetes.io/projected/76c46c1f-700a-45a7-ab95-42782085e0ed-kube-api-access-7dnjz\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.263870 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c46c1f-700a-45a7-ab95-42782085e0ed-config-volume\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.263982 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c46c1f-700a-45a7-ab95-42782085e0ed-secret-volume\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.306952 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14723f34-bd42-4857-836e-35470e2645a0","Type":"ContainerStarted","Data":"b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52"} Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.313285 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c66f23ec-bd44-4452-9a43-cb68dea4145d","Type":"ContainerStarted","Data":"948bd8f168a0d08ab29220942f11b9dd4b2729a6f0a6288394b70d9223117f8a"} Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.323841 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.325236 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b4f4cfbf-b59xq" event={"ID":"79f52949-db23-4922-adf0-eb1122fe74a5","Type":"ContainerDied","Data":"97ad239bf612d5d3f2803c6a25678dc27c3bf1cd062e9f76f0c01152cdb71b56"} Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.325296 4929 scope.go:117] "RemoveContainer" containerID="b40a250f80ae363a7b44880a9f84e07de3cd45804238dc2868f5a718ec1da41f" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.334173 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.334149601 podStartE2EDuration="3.334149601s" podCreationTimestamp="2025-10-02 12:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:00.325871722 +0000 UTC m=+5700.876238096" watchObservedRunningTime="2025-10-02 12:45:00.334149601 +0000 UTC m=+5700.884515975" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.334220 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596f69b-3e4d-49f7-b85b-7508b7605c46","Type":"ContainerStarted","Data":"5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81"} Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.334582 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596f69b-3e4d-49f7-b85b-7508b7605c46","Type":"ContainerStarted","Data":"6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b"} Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.352560 4929 scope.go:117] "RemoveContainer" containerID="2c9e3924094901738b5b42f2d198b3a80e24ebd1bd6c40e5d0c395166ca38d9f" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.361001 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56b4f4cfbf-b59xq"] Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.365766 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dnjz\" (UniqueName: \"kubernetes.io/projected/76c46c1f-700a-45a7-ab95-42782085e0ed-kube-api-access-7dnjz\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.365897 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c46c1f-700a-45a7-ab95-42782085e0ed-config-volume\") pod 
\"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.365934 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c46c1f-700a-45a7-ab95-42782085e0ed-secret-volume\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.367884 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c46c1f-700a-45a7-ab95-42782085e0ed-config-volume\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.372976 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c46c1f-700a-45a7-ab95-42782085e0ed-secret-volume\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.374028 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56b4f4cfbf-b59xq"] Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.375094 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.375075721 podStartE2EDuration="3.375075721s" podCreationTimestamp="2025-10-02 12:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:00.370813468 +0000 UTC m=+5700.921179832" watchObservedRunningTime="2025-10-02 12:45:00.375075721 +0000 UTC m=+5700.925442085" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.386467 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dnjz\" (UniqueName: \"kubernetes.io/projected/76c46c1f-700a-45a7-ab95-42782085e0ed-kube-api-access-7dnjz\") pod \"collect-profiles-29323485-xtjkm\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:00 crc kubenswrapper[4929]: I1002 12:45:00.535460 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:01 crc kubenswrapper[4929]: I1002 12:45:01.000155 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm"] Oct 02 12:45:01 crc kubenswrapper[4929]: I1002 12:45:01.344713 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" event={"ID":"76c46c1f-700a-45a7-ab95-42782085e0ed","Type":"ContainerStarted","Data":"9cec864a1d9a9f1dd02769dd89c1de8cca7c45b7a4d6ce3eebecbb83b39fb157"} Oct 02 12:45:01 crc kubenswrapper[4929]: I1002 12:45:01.344760 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" event={"ID":"76c46c1f-700a-45a7-ab95-42782085e0ed","Type":"ContainerStarted","Data":"5a21cbccf2698255e393a584ca87479ac25f550cc0bcdd2ac4f01344802f8fe7"} Oct 02 12:45:01 crc kubenswrapper[4929]: I1002 12:45:01.347078 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c66f23ec-bd44-4452-9a43-cb68dea4145d","Type":"ContainerStarted","Data":"ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613"} Oct 02 12:45:01 crc kubenswrapper[4929]: I1002 12:45:01.364189 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" podStartSLOduration=1.364164917 podStartE2EDuration="1.364164917s" podCreationTimestamp="2025-10-02 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:01.361440149 +0000 UTC m=+5701.911806523" watchObservedRunningTime="2025-10-02 12:45:01.364164917 +0000 UTC m=+5701.914531291" Oct 02 12:45:01 crc kubenswrapper[4929]: I1002 12:45:01.389019 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.388998123 podStartE2EDuration="3.388998123s" podCreationTimestamp="2025-10-02 12:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:01.378774589 +0000 UTC m=+5701.929140963" watchObservedRunningTime="2025-10-02 12:45:01.388998123 +0000 UTC m=+5701.939364487" Oct 02 12:45:02 crc kubenswrapper[4929]: I1002 12:45:02.168470 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f52949-db23-4922-adf0-eb1122fe74a5" path="/var/lib/kubelet/pods/79f52949-db23-4922-adf0-eb1122fe74a5/volumes" Oct 02 12:45:02 crc kubenswrapper[4929]: I1002 12:45:02.361445 4929 generic.go:334] "Generic (PLEG): container finished" podID="76c46c1f-700a-45a7-ab95-42782085e0ed" containerID="9cec864a1d9a9f1dd02769dd89c1de8cca7c45b7a4d6ce3eebecbb83b39fb157" exitCode=0 Oct 02 12:45:02 crc kubenswrapper[4929]: I1002 12:45:02.361566 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" event={"ID":"76c46c1f-700a-45a7-ab95-42782085e0ed","Type":"ContainerDied","Data":"9cec864a1d9a9f1dd02769dd89c1de8cca7c45b7a4d6ce3eebecbb83b39fb157"} Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.022470 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.022524 4929 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.685565 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.840144 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c46c1f-700a-45a7-ab95-42782085e0ed-config-volume\") pod \"76c46c1f-700a-45a7-ab95-42782085e0ed\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.840272 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c46c1f-700a-45a7-ab95-42782085e0ed-secret-volume\") pod \"76c46c1f-700a-45a7-ab95-42782085e0ed\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.840457 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dnjz\" (UniqueName: \"kubernetes.io/projected/76c46c1f-700a-45a7-ab95-42782085e0ed-kube-api-access-7dnjz\") pod \"76c46c1f-700a-45a7-ab95-42782085e0ed\" (UID: \"76c46c1f-700a-45a7-ab95-42782085e0ed\") " Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.841180 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c46c1f-700a-45a7-ab95-42782085e0ed-config-volume" (OuterVolumeSpecName: "config-volume") pod "76c46c1f-700a-45a7-ab95-42782085e0ed" (UID: "76c46c1f-700a-45a7-ab95-42782085e0ed"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.841887 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c46c1f-700a-45a7-ab95-42782085e0ed-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.846393 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c46c1f-700a-45a7-ab95-42782085e0ed-kube-api-access-7dnjz" (OuterVolumeSpecName: "kube-api-access-7dnjz") pod "76c46c1f-700a-45a7-ab95-42782085e0ed" (UID: "76c46c1f-700a-45a7-ab95-42782085e0ed"). InnerVolumeSpecName "kube-api-access-7dnjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.846416 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c46c1f-700a-45a7-ab95-42782085e0ed-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76c46c1f-700a-45a7-ab95-42782085e0ed" (UID: "76c46c1f-700a-45a7-ab95-42782085e0ed"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.943599 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dnjz\" (UniqueName: \"kubernetes.io/projected/76c46c1f-700a-45a7-ab95-42782085e0ed-kube-api-access-7dnjz\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.943645 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c46c1f-700a-45a7-ab95-42782085e0ed-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:03 crc kubenswrapper[4929]: I1002 12:45:03.975161 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 12:45:04 crc kubenswrapper[4929]: E1002 12:45:04.252432 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f52949_db23_4922_adf0_eb1122fe74a5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76c46c1f_700a_45a7_ab95_42782085e0ed.slice/crio-5a21cbccf2698255e393a584ca87479ac25f550cc0bcdd2ac4f01344802f8fe7\": RecentStats: unable to find data in memory cache]" Oct 02 12:45:04 crc kubenswrapper[4929]: I1002 12:45:04.383562 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" event={"ID":"76c46c1f-700a-45a7-ab95-42782085e0ed","Type":"ContainerDied","Data":"5a21cbccf2698255e393a584ca87479ac25f550cc0bcdd2ac4f01344802f8fe7"} Oct 02 12:45:04 crc kubenswrapper[4929]: I1002 12:45:04.383613 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm" Oct 02 12:45:04 crc kubenswrapper[4929]: I1002 12:45:04.383623 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a21cbccf2698255e393a584ca87479ac25f550cc0bcdd2ac4f01344802f8fe7" Oct 02 12:45:04 crc kubenswrapper[4929]: I1002 12:45:04.424808 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc"] Oct 02 12:45:04 crc kubenswrapper[4929]: I1002 12:45:04.433900 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-4grmc"] Oct 02 12:45:06 crc kubenswrapper[4929]: I1002 12:45:06.168384 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f69a79f-60ac-4096-aeb0-0f1edb98dd00" path="/var/lib/kubelet/pods/8f69a79f-60ac-4096-aeb0-0f1edb98dd00/volumes" Oct 02 12:45:06 crc kubenswrapper[4929]: I1002 12:45:06.633424 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.100864 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-x8w2p"] Oct 02 12:45:07 crc kubenswrapper[4929]: E1002 12:45:07.101227 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c46c1f-700a-45a7-ab95-42782085e0ed" containerName="collect-profiles" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.101242 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c46c1f-700a-45a7-ab95-42782085e0ed" containerName="collect-profiles" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.101446 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c46c1f-700a-45a7-ab95-42782085e0ed" containerName="collect-profiles" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.102039 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.108138 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.108900 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.136024 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8w2p"] Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.303138 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-config-data\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.303653 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlfx9\" (UniqueName: \"kubernetes.io/projected/34d855a0-eeae-434a-afbd-dc37924f454f-kube-api-access-mlfx9\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.304537 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.304702 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-scripts\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.406737 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.406778 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-scripts\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.406849 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-config-data\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.406904 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlfx9\" (UniqueName: 
\"kubernetes.io/projected/34d855a0-eeae-434a-afbd-dc37924f454f-kube-api-access-mlfx9\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.412644 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-scripts\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.412900 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.412911 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-config-data\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.430021 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlfx9\" (UniqueName: \"kubernetes.io/projected/34d855a0-eeae-434a-afbd-dc37924f454f-kube-api-access-mlfx9\") pod \"nova-cell1-cell-mapping-x8w2p\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.440377 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:07 crc kubenswrapper[4929]: I1002 12:45:07.707264 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8w2p"] Oct 02 12:45:07 crc kubenswrapper[4929]: W1002 12:45:07.714992 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34d855a0_eeae_434a_afbd_dc37924f454f.slice/crio-7193bd4ae88bdc1f1f4135653fbafda9989a7ff6d14d28d01b5e60e993481976 WatchSource:0}: Error finding container 7193bd4ae88bdc1f1f4135653fbafda9989a7ff6d14d28d01b5e60e993481976: Status 404 returned error can't find the container with id 7193bd4ae88bdc1f1f4135653fbafda9989a7ff6d14d28d01b5e60e993481976 Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.022996 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.023295 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.033646 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.033699 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.428032 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8w2p" event={"ID":"34d855a0-eeae-434a-afbd-dc37924f454f","Type":"ContainerStarted","Data":"9a9f31e9a027290df783b4e7a44c54fe1d1355201d4e3e480f3ef40f1a34afe6"} Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.428088 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8w2p" event={"ID":"34d855a0-eeae-434a-afbd-dc37924f454f","Type":"ContainerStarted","Data":"7193bd4ae88bdc1f1f4135653fbafda9989a7ff6d14d28d01b5e60e993481976"} Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.459833 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-x8w2p" podStartSLOduration=1.459813884 podStartE2EDuration="1.459813884s" podCreationTimestamp="2025-10-02 12:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:08.446797379 +0000 UTC m=+5708.997163743" watchObservedRunningTime="2025-10-02 12:45:08.459813884 +0000 UTC m=+5709.010180248" Oct 02 12:45:08 crc kubenswrapper[4929]: I1002 12:45:08.975066 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 12:45:09 crc kubenswrapper[4929]: I1002 12:45:09.018156 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 12:45:09 crc kubenswrapper[4929]: I1002 12:45:09.191302 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:09 crc kubenswrapper[4929]: I1002 12:45:09.191313 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:09 crc kubenswrapper[4929]: I1002 12:45:09.191464 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.68:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:09 crc kubenswrapper[4929]: I1002 12:45:09.192001 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.68:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:09 crc kubenswrapper[4929]: I1002 12:45:09.480474 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 12:45:13 crc kubenswrapper[4929]: I1002 12:45:13.490628 4929 generic.go:334] "Generic (PLEG): container finished" podID="34d855a0-eeae-434a-afbd-dc37924f454f" containerID="9a9f31e9a027290df783b4e7a44c54fe1d1355201d4e3e480f3ef40f1a34afe6" exitCode=0 Oct 02 12:45:13 crc kubenswrapper[4929]: I1002 12:45:13.490690 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8w2p" event={"ID":"34d855a0-eeae-434a-afbd-dc37924f454f","Type":"ContainerDied","Data":"9a9f31e9a027290df783b4e7a44c54fe1d1355201d4e3e480f3ef40f1a34afe6"} Oct 02 12:45:14 crc kubenswrapper[4929]: E1002 12:45:14.485562 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f52949_db23_4922_adf0_eb1122fe74a5.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:45:14 crc kubenswrapper[4929]: I1002 12:45:14.741519 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:45:14 crc kubenswrapper[4929]: I1002 12:45:14.741572 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:45:14 crc kubenswrapper[4929]: I1002 12:45:14.936467 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.064902 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-config-data\") pod \"34d855a0-eeae-434a-afbd-dc37924f454f\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.065271 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-combined-ca-bundle\") pod \"34d855a0-eeae-434a-afbd-dc37924f454f\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.065548 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-scripts\") pod \"34d855a0-eeae-434a-afbd-dc37924f454f\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.065707 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlfx9\" (UniqueName: \"kubernetes.io/projected/34d855a0-eeae-434a-afbd-dc37924f454f-kube-api-access-mlfx9\") pod \"34d855a0-eeae-434a-afbd-dc37924f454f\" (UID: \"34d855a0-eeae-434a-afbd-dc37924f454f\") " Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.073151 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d855a0-eeae-434a-afbd-dc37924f454f-kube-api-access-mlfx9" (OuterVolumeSpecName: "kube-api-access-mlfx9") pod "34d855a0-eeae-434a-afbd-dc37924f454f" (UID: "34d855a0-eeae-434a-afbd-dc37924f454f"). InnerVolumeSpecName "kube-api-access-mlfx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.088895 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-scripts" (OuterVolumeSpecName: "scripts") pod "34d855a0-eeae-434a-afbd-dc37924f454f" (UID: "34d855a0-eeae-434a-afbd-dc37924f454f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.089341 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34d855a0-eeae-434a-afbd-dc37924f454f" (UID: "34d855a0-eeae-434a-afbd-dc37924f454f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.102128 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-config-data" (OuterVolumeSpecName: "config-data") pod "34d855a0-eeae-434a-afbd-dc37924f454f" (UID: "34d855a0-eeae-434a-afbd-dc37924f454f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.167355 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlfx9\" (UniqueName: \"kubernetes.io/projected/34d855a0-eeae-434a-afbd-dc37924f454f-kube-api-access-mlfx9\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.167395 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.167409 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.167418 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d855a0-eeae-434a-afbd-dc37924f454f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.509789 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-x8w2p" event={"ID":"34d855a0-eeae-434a-afbd-dc37924f454f","Type":"ContainerDied","Data":"7193bd4ae88bdc1f1f4135653fbafda9989a7ff6d14d28d01b5e60e993481976"} Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.509860 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7193bd4ae88bdc1f1f4135653fbafda9989a7ff6d14d28d01b5e60e993481976" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.509860 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-x8w2p" Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.773989 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.774549 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-log" containerID="cri-o://5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81" gracePeriod=30 Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.775101 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-api" containerID="cri-o://6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b" gracePeriod=30 Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.790162 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.790424 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c66f23ec-bd44-4452-9a43-cb68dea4145d" containerName="nova-scheduler-scheduler" containerID="cri-o://ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613" gracePeriod=30 Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.802211 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.803732 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14723f34-bd42-4857-836e-35470e2645a0" 
containerName="nova-metadata-metadata" containerID="cri-o://b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52" gracePeriod=30 Oct 02 12:45:15 crc kubenswrapper[4929]: I1002 12:45:15.804701 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-log" containerID="cri-o://e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8" gracePeriod=30 Oct 02 12:45:16 crc kubenswrapper[4929]: I1002 12:45:16.376185 4929 scope.go:117] "RemoveContainer" containerID="d6b9730bc85ccd2ae176f6b970ebd605b22580c4201b12f95d560e756e37fc94" Oct 02 12:45:16 crc kubenswrapper[4929]: I1002 12:45:16.522691 4929 generic.go:334] "Generic (PLEG): container finished" podID="14723f34-bd42-4857-836e-35470e2645a0" containerID="e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8" exitCode=143 Oct 02 12:45:16 crc kubenswrapper[4929]: I1002 12:45:16.522861 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14723f34-bd42-4857-836e-35470e2645a0","Type":"ContainerDied","Data":"e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8"} Oct 02 12:45:16 crc kubenswrapper[4929]: I1002 12:45:16.525544 4929 generic.go:334] "Generic (PLEG): container finished" podID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerID="5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81" exitCode=143 Oct 02 12:45:16 crc kubenswrapper[4929]: I1002 12:45:16.525590 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596f69b-3e4d-49f7-b85b-7508b7605c46","Type":"ContainerDied","Data":"5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81"} Oct 02 12:45:16 crc kubenswrapper[4929]: I1002 12:45:16.948079 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.101306 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdzxm\" (UniqueName: \"kubernetes.io/projected/c66f23ec-bd44-4452-9a43-cb68dea4145d-kube-api-access-qdzxm\") pod \"c66f23ec-bd44-4452-9a43-cb68dea4145d\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.101451 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-combined-ca-bundle\") pod \"c66f23ec-bd44-4452-9a43-cb68dea4145d\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.101508 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-config-data\") pod \"c66f23ec-bd44-4452-9a43-cb68dea4145d\" (UID: \"c66f23ec-bd44-4452-9a43-cb68dea4145d\") " Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.109021 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66f23ec-bd44-4452-9a43-cb68dea4145d-kube-api-access-qdzxm" (OuterVolumeSpecName: "kube-api-access-qdzxm") pod "c66f23ec-bd44-4452-9a43-cb68dea4145d" (UID: "c66f23ec-bd44-4452-9a43-cb68dea4145d"). InnerVolumeSpecName "kube-api-access-qdzxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.128848 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-config-data" (OuterVolumeSpecName: "config-data") pod "c66f23ec-bd44-4452-9a43-cb68dea4145d" (UID: "c66f23ec-bd44-4452-9a43-cb68dea4145d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.132679 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c66f23ec-bd44-4452-9a43-cb68dea4145d" (UID: "c66f23ec-bd44-4452-9a43-cb68dea4145d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.204582 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdzxm\" (UniqueName: \"kubernetes.io/projected/c66f23ec-bd44-4452-9a43-cb68dea4145d-kube-api-access-qdzxm\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.204626 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.204640 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66f23ec-bd44-4452-9a43-cb68dea4145d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.533765 4929 generic.go:334] "Generic (PLEG): container finished" podID="c66f23ec-bd44-4452-9a43-cb68dea4145d" containerID="ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613" exitCode=0 Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.533816 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.533828 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c66f23ec-bd44-4452-9a43-cb68dea4145d","Type":"ContainerDied","Data":"ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613"} Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.533856 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c66f23ec-bd44-4452-9a43-cb68dea4145d","Type":"ContainerDied","Data":"948bd8f168a0d08ab29220942f11b9dd4b2729a6f0a6288394b70d9223117f8a"} Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.533871 4929 scope.go:117] "RemoveContainer" containerID="ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.555155 4929 scope.go:117] "RemoveContainer" containerID="ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613" Oct 02 12:45:17 crc kubenswrapper[4929]: E1002 12:45:17.555572 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613\": container with ID starting with ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613 not found: ID does not exist" containerID="ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.555614 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613"} err="failed to get container status \"ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613\": rpc error: code = NotFound desc = could not find container \"ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613\": container with ID starting with ab36c8eac7401e3ac3fcbe4455dd849c20f3610bf19da49b2f3cd69bcda52613 not found: ID does not exist" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.581129 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.590454 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.598736 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:45:17 crc kubenswrapper[4929]: E1002 12:45:17.599221 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66f23ec-bd44-4452-9a43-cb68dea4145d" containerName="nova-scheduler-scheduler" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.599242 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66f23ec-bd44-4452-9a43-cb68dea4145d" containerName="nova-scheduler-scheduler" Oct 02 12:45:17 crc kubenswrapper[4929]: E1002 12:45:17.599285 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d855a0-eeae-434a-afbd-dc37924f454f" containerName="nova-manage" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.599329 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d855a0-eeae-434a-afbd-dc37924f454f" containerName="nova-manage" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.599551 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66f23ec-bd44-4452-9a43-cb68dea4145d" containerName="nova-scheduler-scheduler" Oct 02 
12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.599588 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d855a0-eeae-434a-afbd-dc37924f454f" containerName="nova-manage" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.600412 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.602442 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.609256 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.721225 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.721276 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-config-data\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.721342 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghn6d\" (UniqueName: \"kubernetes.io/projected/2e0cbaa7-a006-494f-baa9-a13306e8edb4-kube-api-access-ghn6d\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.822866 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.822927 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-config-data\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.823008 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghn6d\" (UniqueName: \"kubernetes.io/projected/2e0cbaa7-a006-494f-baa9-a13306e8edb4-kube-api-access-ghn6d\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.826517 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-config-data\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.836117 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.840728 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghn6d\" (UniqueName: \"kubernetes.io/projected/2e0cbaa7-a006-494f-baa9-a13306e8edb4-kube-api-access-ghn6d\") pod \"nova-scheduler-0\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " pod="openstack/nova-scheduler-0" Oct 02 12:45:17 crc kubenswrapper[4929]: I1002 12:45:17.923348 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:45:18 crc kubenswrapper[4929]: I1002 12:45:18.168399 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66f23ec-bd44-4452-9a43-cb68dea4145d" path="/var/lib/kubelet/pods/c66f23ec-bd44-4452-9a43-cb68dea4145d/volumes" Oct 02 12:45:18 crc kubenswrapper[4929]: I1002 12:45:18.359165 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:45:18 crc kubenswrapper[4929]: I1002 12:45:18.545647 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e0cbaa7-a006-494f-baa9-a13306e8edb4","Type":"ContainerStarted","Data":"88aaac89aed59628719b23c004f626be01c2dd163e668df55c196021d9fa2717"} Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.383211 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.390442 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.555472 4929 generic.go:334] "Generic (PLEG): container finished" podID="14723f34-bd42-4857-836e-35470e2645a0" containerID="b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52" exitCode=0 Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.555536 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.556111 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14723f34-bd42-4857-836e-35470e2645a0","Type":"ContainerDied","Data":"b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52"} Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.556468 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"14723f34-bd42-4857-836e-35470e2645a0","Type":"ContainerDied","Data":"13d179c21b3dd57a0fbd8b13f0f33df0aa58d4e463cf6417ca4fb0d03fe5f622"} Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.556534 4929 scope.go:117] "RemoveContainer" containerID="b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.557915 4929 generic.go:334] "Generic (PLEG): container finished" podID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerID="6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b" exitCode=0 Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.557979 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.557991 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596f69b-3e4d-49f7-b85b-7508b7605c46","Type":"ContainerDied","Data":"6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b"} Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.559335 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596f69b-3e4d-49f7-b85b-7508b7605c46","Type":"ContainerDied","Data":"3ef8b05ab1e0ed5a2de4d085bb046eee751e1981d1e92dd507f2581c49d91524"} Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.560419 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e0cbaa7-a006-494f-baa9-a13306e8edb4","Type":"ContainerStarted","Data":"2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3"} Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.578137 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-combined-ca-bundle\") pod \"14723f34-bd42-4857-836e-35470e2645a0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.578385 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpmnx\" (UniqueName: \"kubernetes.io/projected/5596f69b-3e4d-49f7-b85b-7508b7605c46-kube-api-access-gpmnx\") pod \"5596f69b-3e4d-49f7-b85b-7508b7605c46\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.579727 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596f69b-3e4d-49f7-b85b-7508b7605c46-logs\") pod \"5596f69b-3e4d-49f7-b85b-7508b7605c46\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.579817 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nggrw\" (UniqueName: \"kubernetes.io/projected/14723f34-bd42-4857-836e-35470e2645a0-kube-api-access-nggrw\") pod \"14723f34-bd42-4857-836e-35470e2645a0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.579842 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-config-data\") pod \"5596f69b-3e4d-49f7-b85b-7508b7605c46\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.579938 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14723f34-bd42-4857-836e-35470e2645a0-logs\") pod \"14723f34-bd42-4857-836e-35470e2645a0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.579980 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-config-data\") pod \"14723f34-bd42-4857-836e-35470e2645a0\" (UID: \"14723f34-bd42-4857-836e-35470e2645a0\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.580029 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-combined-ca-bundle\") pod \"5596f69b-3e4d-49f7-b85b-7508b7605c46\" (UID: \"5596f69b-3e4d-49f7-b85b-7508b7605c46\") " Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.580056 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.580044735 podStartE2EDuration="2.580044735s" podCreationTimestamp="2025-10-02 12:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:19.578698776 +0000 UTC m=+5720.129065150" watchObservedRunningTime="2025-10-02 12:45:19.580044735 +0000 UTC m=+5720.130411099" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.580725 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14723f34-bd42-4857-836e-35470e2645a0-logs" (OuterVolumeSpecName: "logs") pod "14723f34-bd42-4857-836e-35470e2645a0" (UID: "14723f34-bd42-4857-836e-35470e2645a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.580810 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5596f69b-3e4d-49f7-b85b-7508b7605c46-logs" (OuterVolumeSpecName: "logs") pod "5596f69b-3e4d-49f7-b85b-7508b7605c46" (UID: "5596f69b-3e4d-49f7-b85b-7508b7605c46"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.581527 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14723f34-bd42-4857-836e-35470e2645a0-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.581556 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596f69b-3e4d-49f7-b85b-7508b7605c46-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.583175 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5596f69b-3e4d-49f7-b85b-7508b7605c46-kube-api-access-gpmnx" (OuterVolumeSpecName: "kube-api-access-gpmnx") pod "5596f69b-3e4d-49f7-b85b-7508b7605c46" (UID: "5596f69b-3e4d-49f7-b85b-7508b7605c46"). InnerVolumeSpecName "kube-api-access-gpmnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.585299 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14723f34-bd42-4857-836e-35470e2645a0-kube-api-access-nggrw" (OuterVolumeSpecName: "kube-api-access-nggrw") pod "14723f34-bd42-4857-836e-35470e2645a0" (UID: "14723f34-bd42-4857-836e-35470e2645a0"). InnerVolumeSpecName "kube-api-access-nggrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.586082 4929 scope.go:117] "RemoveContainer" containerID="e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.603273 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-config-data" (OuterVolumeSpecName: "config-data") pod "5596f69b-3e4d-49f7-b85b-7508b7605c46" (UID: "5596f69b-3e4d-49f7-b85b-7508b7605c46"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.605591 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5596f69b-3e4d-49f7-b85b-7508b7605c46" (UID: "5596f69b-3e4d-49f7-b85b-7508b7605c46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.606245 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14723f34-bd42-4857-836e-35470e2645a0" (UID: "14723f34-bd42-4857-836e-35470e2645a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.608778 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-config-data" (OuterVolumeSpecName: "config-data") pod "14723f34-bd42-4857-836e-35470e2645a0" (UID: "14723f34-bd42-4857-836e-35470e2645a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.683945 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.684033 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.684047 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpmnx\" (UniqueName: \"kubernetes.io/projected/5596f69b-3e4d-49f7-b85b-7508b7605c46-kube-api-access-gpmnx\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.684062 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nggrw\" (UniqueName: \"kubernetes.io/projected/14723f34-bd42-4857-836e-35470e2645a0-kube-api-access-nggrw\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.684075 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596f69b-3e4d-49f7-b85b-7508b7605c46-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.684090 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14723f34-bd42-4857-836e-35470e2645a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.692643 4929 scope.go:117] "RemoveContainer" containerID="b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52" Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.693276 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52\": container with ID starting with 
b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52 not found: ID does not exist" containerID="b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.693381 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52"} err="failed to get container status \"b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52\": rpc error: code = NotFound desc = could not find container \"b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52\": container with ID starting with b853837893ff334238290c7f75c07a50c8d108dd01ba199267a75357a4684b52 not found: ID does not exist" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.693421 4929 scope.go:117] "RemoveContainer" containerID="e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8" Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.693843 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8\": container with ID starting with e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8 not found: ID does not exist" containerID="e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.693880 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8"} err="failed to get container status \"e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8\": rpc error: code = NotFound desc = could not find container \"e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8\": container with ID starting with e5f9ca255b4b6c0eb2f60ba6b5bed27c093c451b7b756ff7d5265cace83ecaf8 not found: ID does not exist" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.693918 4929 scope.go:117] "RemoveContainer" containerID="6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.716700 4929 scope.go:117] "RemoveContainer" containerID="5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.740685 4929 scope.go:117] "RemoveContainer" containerID="6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b" Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.741156 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b\": container with ID starting with 6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b not found: ID does not exist" containerID="6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.741214 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b"} err="failed to get container status \"6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b\": rpc error: code = NotFound desc = could not find container \"6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b\": container with ID starting with 
6115651278e3b3033684026456aa4ccda9b69679edaf151fb1515923f5f45f2b not found: ID does not exist" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.741270 4929 scope.go:117] "RemoveContainer" containerID="5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81" Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.741698 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81\": container with ID starting with 5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81 not found: ID does not exist" containerID="5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.741747 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81"} err="failed to get container status \"5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81\": rpc error: code = NotFound desc = could not find container \"5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81\": container with ID starting with 5b5447de3de2bd7f984ffa151aff22d1268262dbc091069f38eae0bbf9bd0c81 not found: ID does not exist" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.895523 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.917671 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.936296 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.943993 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957073 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.957542 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-log" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957562 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-log" Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.957580 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-metadata" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957587 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-metadata" Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.957614 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-api" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957621 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-api" Oct 02 12:45:19 crc kubenswrapper[4929]: E1002 12:45:19.957642 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-log" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957648 4929 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-log" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957813 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-log" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957830 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-log" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957842 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="14723f34-bd42-4857-836e-35470e2645a0" containerName="nova-metadata-metadata" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.957861 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" containerName="nova-api-api" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.958951 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.961303 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.966566 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.968910 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.971243 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.980297 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:45:19 crc kubenswrapper[4929]: I1002 12:45:19.995474 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091539 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-config-data\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091592 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db37045c-cb4a-45f1-b530-a0da6442becd-logs\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091629 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091668 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rpr\" (UniqueName: \"kubernetes.io/projected/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-kube-api-access-92rpr\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 
12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091700 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091717 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-logs\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091764 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24k2g\" (UniqueName: \"kubernetes.io/projected/db37045c-cb4a-45f1-b530-a0da6442becd-kube-api-access-24k2g\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.091802 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-config-data\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.167437 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14723f34-bd42-4857-836e-35470e2645a0" path="/var/lib/kubelet/pods/14723f34-bd42-4857-836e-35470e2645a0/volumes" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.168130 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5596f69b-3e4d-49f7-b85b-7508b7605c46" path="/var/lib/kubelet/pods/5596f69b-3e4d-49f7-b85b-7508b7605c46/volumes" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.193997 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db37045c-cb4a-45f1-b530-a0da6442becd-logs\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194072 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194118 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92rpr\" (UniqueName: \"kubernetes.io/projected/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-kube-api-access-92rpr\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194152 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194168 4929 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-logs\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194188 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24k2g\" (UniqueName: \"kubernetes.io/projected/db37045c-cb4a-45f1-b530-a0da6442becd-kube-api-access-24k2g\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194224 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-config-data\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194288 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-config-data\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.194409 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db37045c-cb4a-45f1-b530-a0da6442becd-logs\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.195019 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-logs\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.198120 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.198548 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-config-data\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.210855 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.211975 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-config-data\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.219001 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24k2g\" (UniqueName: 
\"kubernetes.io/projected/db37045c-cb4a-45f1-b530-a0da6442becd-kube-api-access-24k2g\") pod \"nova-metadata-0\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.224211 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rpr\" (UniqueName: \"kubernetes.io/projected/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-kube-api-access-92rpr\") pod \"nova-api-0\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.283523 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.293290 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.558485 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:45:20 crc kubenswrapper[4929]: W1002 12:45:20.564012 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb37045c_cb4a_45f1_b530_a0da6442becd.slice/crio-c0470a5ac14e8fa175d7ed2f8c4cc24371020ef26dfd9d63325aa4a8c11512b3 WatchSource:0}: Error finding container c0470a5ac14e8fa175d7ed2f8c4cc24371020ef26dfd9d63325aa4a8c11512b3: Status 404 returned error can't find the container with id c0470a5ac14e8fa175d7ed2f8c4cc24371020ef26dfd9d63325aa4a8c11512b3 Oct 02 12:45:20 crc kubenswrapper[4929]: I1002 12:45:20.871752 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:45:20 crc kubenswrapper[4929]: W1002 12:45:20.878100 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3e86650_6dc6_42d4_9c1d_879f111ff9b2.slice/crio-d396728734ac862ab3ab7da2365b8c8e455f57d1356dfa604d03c7e04292c137 WatchSource:0}: Error finding container d396728734ac862ab3ab7da2365b8c8e455f57d1356dfa604d03c7e04292c137: Status 404 returned error can't find the container with id d396728734ac862ab3ab7da2365b8c8e455f57d1356dfa604d03c7e04292c137 Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.586006 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e86650-6dc6-42d4-9c1d-879f111ff9b2","Type":"ContainerStarted","Data":"84380993db879437c54629cc1e449cc7687f9263ae86ca84c03ed10446158707"} Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.586058 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e86650-6dc6-42d4-9c1d-879f111ff9b2","Type":"ContainerStarted","Data":"a1f761566b9cf0369910eefb317521aee955169d792d415a49d58e4e3724d248"} Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.586071 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e86650-6dc6-42d4-9c1d-879f111ff9b2","Type":"ContainerStarted","Data":"d396728734ac862ab3ab7da2365b8c8e455f57d1356dfa604d03c7e04292c137"} Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.588567 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db37045c-cb4a-45f1-b530-a0da6442becd","Type":"ContainerStarted","Data":"ff3905119011c6ae9f0adbd8884839280509cab73d15f363fc97a964b1269082"} Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.588629 4929 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db37045c-cb4a-45f1-b530-a0da6442becd","Type":"ContainerStarted","Data":"7f9b67e5d7fce2e8c21ab8c05c80bf37b31fa8a4ddcdcccfa6ed95c0fbe1c226"} Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.588642 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db37045c-cb4a-45f1-b530-a0da6442becd","Type":"ContainerStarted","Data":"c0470a5ac14e8fa175d7ed2f8c4cc24371020ef26dfd9d63325aa4a8c11512b3"} Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.603791 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6037741629999998 podStartE2EDuration="2.603774163s" podCreationTimestamp="2025-10-02 12:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:21.60229283 +0000 UTC m=+5722.152659194" watchObservedRunningTime="2025-10-02 12:45:21.603774163 +0000 UTC m=+5722.154140527" Oct 02 12:45:21 crc kubenswrapper[4929]: I1002 12:45:21.648610 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.648561195 podStartE2EDuration="2.648561195s" podCreationTimestamp="2025-10-02 12:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:21.631824162 +0000 UTC m=+5722.182190526" watchObservedRunningTime="2025-10-02 12:45:21.648561195 +0000 UTC m=+5722.198927559" Oct 02 12:45:22 crc kubenswrapper[4929]: I1002 12:45:22.923757 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 12:45:24 crc kubenswrapper[4929]: E1002 12:45:24.739204 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f52949_db23_4922_adf0_eb1122fe74a5.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:45:25 crc kubenswrapper[4929]: I1002 12:45:25.284706 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:45:25 crc kubenswrapper[4929]: I1002 12:45:25.285018 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:45:27 crc kubenswrapper[4929]: I1002 12:45:27.924043 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 12:45:27 crc kubenswrapper[4929]: I1002 12:45:27.950750 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 12:45:28 crc kubenswrapper[4929]: I1002 12:45:28.670519 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 12:45:30 crc kubenswrapper[4929]: I1002 12:45:30.283907 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:45:30 crc kubenswrapper[4929]: I1002 12:45:30.283974 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:45:30 crc kubenswrapper[4929]: I1002 12:45:30.293475 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:45:30 crc kubenswrapper[4929]: I1002 12:45:30.293531 4929 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:45:31 crc kubenswrapper[4929]: I1002 12:45:31.451256 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:31 crc kubenswrapper[4929]: I1002 12:45:31.451309 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:31 crc kubenswrapper[4929]: I1002 12:45:31.451358 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:31 crc kubenswrapper[4929]: I1002 12:45:31.451376 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.74:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:45:34 crc kubenswrapper[4929]: E1002 12:45:34.959345 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f52949_db23_4922_adf0_eb1122fe74a5.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.286275 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.286754 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.288005 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.289513 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.297188 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.297711 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.297836 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.308685 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.762000 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.766431 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" 
Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.925162 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759fbf6bbf-q8jqx"]
Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.926632 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:40 crc kubenswrapper[4929]: I1002 12:45:40.949482 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759fbf6bbf-q8jqx"]
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.066125 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-nb\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.066390 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-sb\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.066591 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-config\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.066720 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hwsz\" (UniqueName: \"kubernetes.io/projected/d1f53255-170f-4bcd-be6c-9b48995ab571-kube-api-access-7hwsz\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.066825 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-dns-svc\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.169028 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-config\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.169340 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hwsz\" (UniqueName: \"kubernetes.io/projected/d1f53255-170f-4bcd-be6c-9b48995ab571-kube-api-access-7hwsz\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.169492 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-dns-svc\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.169595 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-nb\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.169760 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-sb\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.169859 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-config\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.170354 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-dns-svc\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.170521 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-nb\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.170622 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-sb\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.186864 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hwsz\" (UniqueName: \"kubernetes.io/projected/d1f53255-170f-4bcd-be6c-9b48995ab571-kube-api-access-7hwsz\") pod \"dnsmasq-dns-759fbf6bbf-q8jqx\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.246988 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.635239 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759fbf6bbf-q8jqx"]
Oct 02 12:45:41 crc kubenswrapper[4929]: I1002 12:45:41.772654 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" event={"ID":"d1f53255-170f-4bcd-be6c-9b48995ab571","Type":"ContainerStarted","Data":"94f93174b702a4cf054abe57d1c5417102845dc33993b96d4b2e7e56168dbe1c"}
Oct 02 12:45:42 crc kubenswrapper[4929]: I1002 12:45:42.780086 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1f53255-170f-4bcd-be6c-9b48995ab571" containerID="18b0590ac5a308d3d430b9a0be30e2c95b3f0341050860c9f0d92afd83d6c2d4" exitCode=0
Oct 02 12:45:42 crc kubenswrapper[4929]: I1002 12:45:42.780133 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" event={"ID":"d1f53255-170f-4bcd-be6c-9b48995ab571","Type":"ContainerDied","Data":"18b0590ac5a308d3d430b9a0be30e2c95b3f0341050860c9f0d92afd83d6c2d4"}
Oct 02 12:45:43 crc kubenswrapper[4929]: I1002 12:45:43.799819 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" event={"ID":"d1f53255-170f-4bcd-be6c-9b48995ab571","Type":"ContainerStarted","Data":"a7914aba29d9d75a4cee3239e493b4ffd797bc451af1714c92534269b4f59d3c"}
Oct 02 12:45:43 crc kubenswrapper[4929]: I1002 12:45:43.800122 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:43 crc kubenswrapper[4929]: I1002 12:45:43.828110 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" podStartSLOduration=3.828081976 podStartE2EDuration="3.828081976s" podCreationTimestamp="2025-10-02 12:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:43.823935697 +0000 UTC m=+5744.374302071" watchObservedRunningTime="2025-10-02 12:45:43.828081976 +0000 UTC m=+5744.378448350"
Oct 02 12:45:44 crc kubenswrapper[4929]: I1002 12:45:44.736633 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:45:44 crc kubenswrapper[4929]: I1002 12:45:44.737380 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:45:45 crc kubenswrapper[4929]: E1002 12:45:45.201258 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f52949_db23_4922_adf0_eb1122fe74a5.slice\": RecentStats: unable to find data in memory cache]"
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.248692 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx"
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.318347 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc869bb7-k6d6j"]
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.318935 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" podUID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerName="dnsmasq-dns" containerID="cri-o://c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5" gracePeriod=10
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.817221 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j"
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.868552 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-config\") pod \"2257267d-e6ea-45bb-b3c4-a30267fc9147\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") "
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.868694 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-nb\") pod \"2257267d-e6ea-45bb-b3c4-a30267fc9147\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") "
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.868774 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-sb\") pod \"2257267d-e6ea-45bb-b3c4-a30267fc9147\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") "
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.868836 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-dns-svc\") pod \"2257267d-e6ea-45bb-b3c4-a30267fc9147\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") "
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.868854 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ncf8\" (UniqueName: \"kubernetes.io/projected/2257267d-e6ea-45bb-b3c4-a30267fc9147-kube-api-access-6ncf8\") pod \"2257267d-e6ea-45bb-b3c4-a30267fc9147\" (UID: \"2257267d-e6ea-45bb-b3c4-a30267fc9147\") "
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.876227 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2257267d-e6ea-45bb-b3c4-a30267fc9147-kube-api-access-6ncf8" (OuterVolumeSpecName: "kube-api-access-6ncf8") pod "2257267d-e6ea-45bb-b3c4-a30267fc9147" (UID: "2257267d-e6ea-45bb-b3c4-a30267fc9147"). InnerVolumeSpecName "kube-api-access-6ncf8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.882653 4929 generic.go:334] "Generic (PLEG): container finished" podID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerID="c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5" exitCode=0
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.882704 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" event={"ID":"2257267d-e6ea-45bb-b3c4-a30267fc9147","Type":"ContainerDied","Data":"c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5"}
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.882738 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j" event={"ID":"2257267d-e6ea-45bb-b3c4-a30267fc9147","Type":"ContainerDied","Data":"c9fd5ad1d79f3a9e29f5346b4143f50aedef6ba82a40de408527753345b931de"}
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.882757 4929 scope.go:117] "RemoveContainer" containerID="c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5"
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.882893 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc869bb7-k6d6j"
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.920717 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2257267d-e6ea-45bb-b3c4-a30267fc9147" (UID: "2257267d-e6ea-45bb-b3c4-a30267fc9147"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.923857 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2257267d-e6ea-45bb-b3c4-a30267fc9147" (UID: "2257267d-e6ea-45bb-b3c4-a30267fc9147"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.927491 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2257267d-e6ea-45bb-b3c4-a30267fc9147" (UID: "2257267d-e6ea-45bb-b3c4-a30267fc9147"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.941001 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-config" (OuterVolumeSpecName: "config") pod "2257267d-e6ea-45bb-b3c4-a30267fc9147" (UID: "2257267d-e6ea-45bb-b3c4-a30267fc9147"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.972297 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.972348 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.972363 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ncf8\" (UniqueName: \"kubernetes.io/projected/2257267d-e6ea-45bb-b3c4-a30267fc9147-kube-api-access-6ncf8\") on node \"crc\" DevicePath \"\""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.972385 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-config\") on node \"crc\" DevicePath \"\""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.972399 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2257267d-e6ea-45bb-b3c4-a30267fc9147-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 12:45:51 crc kubenswrapper[4929]: I1002 12:45:51.985563 4929 scope.go:117] "RemoveContainer" containerID="00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752"
Oct 02 12:45:52 crc kubenswrapper[4929]: I1002 12:45:52.009249 4929 scope.go:117] "RemoveContainer" containerID="c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5"
Oct 02 12:45:52 crc kubenswrapper[4929]: E1002 12:45:52.010476 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5\": container with ID starting with c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5 not found: ID does not exist" containerID="c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5"
Oct 02 12:45:52 crc kubenswrapper[4929]: I1002 12:45:52.010524 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5"} err="failed to get container status \"c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5\": rpc error: code = NotFound desc = could not find container \"c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5\": container with ID starting with c7af72537acfb6ae101b3b5e6160752dce8e67b78a7afdb900a843d7961f12a5 not found: ID does not exist"
Oct 02 12:45:52 crc kubenswrapper[4929]: I1002 12:45:52.010556 4929 scope.go:117] "RemoveContainer" containerID="00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752"
Oct 02 12:45:52 crc kubenswrapper[4929]: E1002 12:45:52.011491 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752\": container with ID starting with 00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752 not found: ID does not exist" containerID="00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752"
Oct 02 12:45:52 crc kubenswrapper[4929]: I1002 12:45:52.011518 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752"} err="failed to get container status \"00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752\": rpc error: code = NotFound desc = could not find container \"00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752\": container with ID starting with 00963cfa6b80561bd67b2caf8e01aadd080dbe3e6daf6c2325881e1b28c58752 not found: ID does not exist"
Oct 02 12:45:52 crc kubenswrapper[4929]: I1002 12:45:52.210930 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc869bb7-k6d6j"]
Oct 02 12:45:52 crc kubenswrapper[4929]: I1002 12:45:52.217464 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cc869bb7-k6d6j"]
Oct 02 12:45:54 crc kubenswrapper[4929]: I1002 12:45:54.169665 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2257267d-e6ea-45bb-b3c4-a30267fc9147" path="/var/lib/kubelet/pods/2257267d-e6ea-45bb-b3c4-a30267fc9147/volumes"
Oct 02 12:45:55 crc kubenswrapper[4929]: E1002 12:45:55.445341 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f52949_db23_4922_adf0_eb1122fe74a5.slice\": RecentStats: unable to find data in memory cache]"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.678913 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-f62zg"]
Oct 02 12:45:55 crc kubenswrapper[4929]: E1002 12:45:55.679628 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerName="init"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.679653 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerName="init"
Oct 02 12:45:55 crc kubenswrapper[4929]: E1002 12:45:55.679685 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerName="dnsmasq-dns"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.679695 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerName="dnsmasq-dns"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.680008 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2257267d-e6ea-45bb-b3c4-a30267fc9147" containerName="dnsmasq-dns"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.680828 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f62zg"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.691331 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f62zg"]
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.743952 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtv2x\" (UniqueName: \"kubernetes.io/projected/9390df5a-244b-456d-b23a-be1ddcac2bc5-kube-api-access-jtv2x\") pod \"cinder-db-create-f62zg\" (UID: \"9390df5a-244b-456d-b23a-be1ddcac2bc5\") " pod="openstack/cinder-db-create-f62zg"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.846261 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtv2x\" (UniqueName: \"kubernetes.io/projected/9390df5a-244b-456d-b23a-be1ddcac2bc5-kube-api-access-jtv2x\") pod \"cinder-db-create-f62zg\" (UID: \"9390df5a-244b-456d-b23a-be1ddcac2bc5\") " pod="openstack/cinder-db-create-f62zg"
Oct 02 12:45:55 crc kubenswrapper[4929]: I1002 12:45:55.871676 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtv2x\" (UniqueName: \"kubernetes.io/projected/9390df5a-244b-456d-b23a-be1ddcac2bc5-kube-api-access-jtv2x\") pod \"cinder-db-create-f62zg\" (UID: \"9390df5a-244b-456d-b23a-be1ddcac2bc5\") " pod="openstack/cinder-db-create-f62zg"
Oct 02 12:45:56 crc kubenswrapper[4929]: I1002 12:45:56.039505 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f62zg"
Oct 02 12:45:56 crc kubenswrapper[4929]: W1002 12:45:56.522770 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9390df5a_244b_456d_b23a_be1ddcac2bc5.slice/crio-10237b84e3c00a60f935a70bcf8a526830c3952c0fc77507768e6eacfc6c137c WatchSource:0}: Error finding container 10237b84e3c00a60f935a70bcf8a526830c3952c0fc77507768e6eacfc6c137c: Status 404 returned error can't find the container with id 10237b84e3c00a60f935a70bcf8a526830c3952c0fc77507768e6eacfc6c137c
Oct 02 12:45:56 crc kubenswrapper[4929]: I1002 12:45:56.528240 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f62zg"]
Oct 02 12:45:56 crc kubenswrapper[4929]: I1002 12:45:56.937217 4929 generic.go:334] "Generic (PLEG): container finished" podID="9390df5a-244b-456d-b23a-be1ddcac2bc5" containerID="6f907189e14b13cde3faa31d2b31b248f1ebf717fbd175e8f6f3ef51a539c489" exitCode=0
Oct 02 12:45:56 crc kubenswrapper[4929]: I1002 12:45:56.937269 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f62zg" event={"ID":"9390df5a-244b-456d-b23a-be1ddcac2bc5","Type":"ContainerDied","Data":"6f907189e14b13cde3faa31d2b31b248f1ebf717fbd175e8f6f3ef51a539c489"}
Oct 02 12:45:56 crc kubenswrapper[4929]: I1002 12:45:56.937299 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f62zg" event={"ID":"9390df5a-244b-456d-b23a-be1ddcac2bc5","Type":"ContainerStarted","Data":"10237b84e3c00a60f935a70bcf8a526830c3952c0fc77507768e6eacfc6c137c"}
Oct 02 12:45:58 crc kubenswrapper[4929]: I1002 12:45:58.285296 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f62zg"
Oct 02 12:45:58 crc kubenswrapper[4929]: I1002 12:45:58.405686 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtv2x\" (UniqueName: \"kubernetes.io/projected/9390df5a-244b-456d-b23a-be1ddcac2bc5-kube-api-access-jtv2x\") pod \"9390df5a-244b-456d-b23a-be1ddcac2bc5\" (UID: \"9390df5a-244b-456d-b23a-be1ddcac2bc5\") "
Oct 02 12:45:58 crc kubenswrapper[4929]: I1002 12:45:58.410549 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9390df5a-244b-456d-b23a-be1ddcac2bc5-kube-api-access-jtv2x" (OuterVolumeSpecName: "kube-api-access-jtv2x") pod "9390df5a-244b-456d-b23a-be1ddcac2bc5" (UID: "9390df5a-244b-456d-b23a-be1ddcac2bc5"). InnerVolumeSpecName "kube-api-access-jtv2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:45:58 crc kubenswrapper[4929]: I1002 12:45:58.507409 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtv2x\" (UniqueName: \"kubernetes.io/projected/9390df5a-244b-456d-b23a-be1ddcac2bc5-kube-api-access-jtv2x\") on node \"crc\" DevicePath \"\""
Oct 02 12:45:58 crc kubenswrapper[4929]: I1002 12:45:58.958413 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f62zg" event={"ID":"9390df5a-244b-456d-b23a-be1ddcac2bc5","Type":"ContainerDied","Data":"10237b84e3c00a60f935a70bcf8a526830c3952c0fc77507768e6eacfc6c137c"}
Oct 02 12:45:58 crc kubenswrapper[4929]: I1002 12:45:58.958467 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10237b84e3c00a60f935a70bcf8a526830c3952c0fc77507768e6eacfc6c137c"
Oct 02 12:45:58 crc kubenswrapper[4929]: I1002 12:45:58.958554 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f62zg"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.801847 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-92b0-account-create-8sm79"]
Oct 02 12:46:05 crc kubenswrapper[4929]: E1002 12:46:05.802830 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9390df5a-244b-456d-b23a-be1ddcac2bc5" containerName="mariadb-database-create"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.802846 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="9390df5a-244b-456d-b23a-be1ddcac2bc5" containerName="mariadb-database-create"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.803072 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="9390df5a-244b-456d-b23a-be1ddcac2bc5" containerName="mariadb-database-create"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.803763 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-92b0-account-create-8sm79"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.805908 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.815006 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-92b0-account-create-8sm79"]
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.838465 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55clh\" (UniqueName: \"kubernetes.io/projected/69eb84ff-3ff3-4b30-955f-a5570beb8c31-kube-api-access-55clh\") pod \"cinder-92b0-account-create-8sm79\" (UID: \"69eb84ff-3ff3-4b30-955f-a5570beb8c31\") " pod="openstack/cinder-92b0-account-create-8sm79"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.941132 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55clh\" (UniqueName: \"kubernetes.io/projected/69eb84ff-3ff3-4b30-955f-a5570beb8c31-kube-api-access-55clh\") pod \"cinder-92b0-account-create-8sm79\" (UID: \"69eb84ff-3ff3-4b30-955f-a5570beb8c31\") " pod="openstack/cinder-92b0-account-create-8sm79"
Oct 02 12:46:05 crc kubenswrapper[4929]: I1002 12:46:05.959202 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55clh\" (UniqueName: \"kubernetes.io/projected/69eb84ff-3ff3-4b30-955f-a5570beb8c31-kube-api-access-55clh\") pod \"cinder-92b0-account-create-8sm79\" (UID: \"69eb84ff-3ff3-4b30-955f-a5570beb8c31\") " pod="openstack/cinder-92b0-account-create-8sm79"
Oct 02 12:46:06 crc kubenswrapper[4929]: I1002 12:46:06.134935 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-92b0-account-create-8sm79"
Oct 02 12:46:06 crc kubenswrapper[4929]: I1002 12:46:06.683873 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-92b0-account-create-8sm79"]
Oct 02 12:46:07 crc kubenswrapper[4929]: I1002 12:46:07.040027 4929 generic.go:334] "Generic (PLEG): container finished" podID="69eb84ff-3ff3-4b30-955f-a5570beb8c31" containerID="dd1eef42e5188ebdcf01a6cce1eaaf99a94695c98863bc7c5bcff1ea5492bbc3" exitCode=0
Oct 02 12:46:07 crc kubenswrapper[4929]: I1002 12:46:07.040250 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-92b0-account-create-8sm79" event={"ID":"69eb84ff-3ff3-4b30-955f-a5570beb8c31","Type":"ContainerDied","Data":"dd1eef42e5188ebdcf01a6cce1eaaf99a94695c98863bc7c5bcff1ea5492bbc3"}
Oct 02 12:46:07 crc kubenswrapper[4929]: I1002 12:46:07.040313 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-92b0-account-create-8sm79" event={"ID":"69eb84ff-3ff3-4b30-955f-a5570beb8c31","Type":"ContainerStarted","Data":"6f8d694f7d1797a0c816cbe97e6ecaa1631f6af438e22711f63b9920afc9e728"}
Oct 02 12:46:08 crc kubenswrapper[4929]: I1002 12:46:08.395944 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-92b0-account-create-8sm79"
Oct 02 12:46:08 crc kubenswrapper[4929]: I1002 12:46:08.491148 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55clh\" (UniqueName: \"kubernetes.io/projected/69eb84ff-3ff3-4b30-955f-a5570beb8c31-kube-api-access-55clh\") pod \"69eb84ff-3ff3-4b30-955f-a5570beb8c31\" (UID: \"69eb84ff-3ff3-4b30-955f-a5570beb8c31\") "
Oct 02 12:46:08 crc kubenswrapper[4929]: I1002 12:46:08.497899 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69eb84ff-3ff3-4b30-955f-a5570beb8c31-kube-api-access-55clh" (OuterVolumeSpecName: "kube-api-access-55clh") pod "69eb84ff-3ff3-4b30-955f-a5570beb8c31" (UID: "69eb84ff-3ff3-4b30-955f-a5570beb8c31"). InnerVolumeSpecName "kube-api-access-55clh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:46:08 crc kubenswrapper[4929]: I1002 12:46:08.593561 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55clh\" (UniqueName: \"kubernetes.io/projected/69eb84ff-3ff3-4b30-955f-a5570beb8c31-kube-api-access-55clh\") on node \"crc\" DevicePath \"\""
Oct 02 12:46:09 crc kubenswrapper[4929]: I1002 12:46:09.059829 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-92b0-account-create-8sm79" event={"ID":"69eb84ff-3ff3-4b30-955f-a5570beb8c31","Type":"ContainerDied","Data":"6f8d694f7d1797a0c816cbe97e6ecaa1631f6af438e22711f63b9920afc9e728"}
Oct 02 12:46:09 crc kubenswrapper[4929]: I1002 12:46:09.059869 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8d694f7d1797a0c816cbe97e6ecaa1631f6af438e22711f63b9920afc9e728"
Oct 02 12:46:09 crc kubenswrapper[4929]: I1002 12:46:09.059885 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-92b0-account-create-8sm79"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.037637 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-bgxnp"]
Oct 02 12:46:11 crc kubenswrapper[4929]: E1002 12:46:11.038128 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eb84ff-3ff3-4b30-955f-a5570beb8c31" containerName="mariadb-account-create"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.038142 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eb84ff-3ff3-4b30-955f-a5570beb8c31" containerName="mariadb-account-create"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.038329 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="69eb84ff-3ff3-4b30-955f-a5570beb8c31" containerName="mariadb-account-create"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.038987 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.040725 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fvxs6"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.041771 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.047881 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.055989 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bgxnp"]
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.135364 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-scripts\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.135442 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-etc-machine-id\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.135590 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-combined-ca-bundle\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.135875 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-config-data\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.135972 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-db-sync-config-data\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.136023 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg9c9\" (UniqueName: \"kubernetes.io/projected/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-kube-api-access-bg9c9\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.237897 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-config-data\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.238015 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-db-sync-config-data\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.238084 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg9c9\" (UniqueName: \"kubernetes.io/projected/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-kube-api-access-bg9c9\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.238140 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-scripts\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.238200 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-etc-machine-id\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.238257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-combined-ca-bundle\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.238588 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-etc-machine-id\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.244764 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-scripts\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.245008 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-combined-ca-bundle\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.245260 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-db-sync-config-data\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.245985 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-config-data\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.258048 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg9c9\" (UniqueName: \"kubernetes.io/projected/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-kube-api-access-bg9c9\") pod \"cinder-db-sync-bgxnp\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") " pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.358075 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:11 crc kubenswrapper[4929]: I1002 12:46:11.806523 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-bgxnp"]
Oct 02 12:46:12 crc kubenswrapper[4929]: I1002 12:46:12.099774 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bgxnp" event={"ID":"62690ece-7ef3-4ada-9ff4-9ed1d858fea6","Type":"ContainerStarted","Data":"67c40db2c7e96442352f02b71a3b6f470356a79b06bd0cd4cc2c36337427cae8"}
Oct 02 12:46:13 crc kubenswrapper[4929]: I1002 12:46:13.110001 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bgxnp" event={"ID":"62690ece-7ef3-4ada-9ff4-9ed1d858fea6","Type":"ContainerStarted","Data":"f4d119c55679d5eb748d901a878c0c350f47bed375a2a3f5be759186bfdcbf15"}
Oct 02 12:46:13 crc kubenswrapper[4929]: I1002 12:46:13.131345 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-bgxnp" podStartSLOduration=2.131325687 podStartE2EDuration="2.131325687s" podCreationTimestamp="2025-10-02 12:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:13.129455043 +0000 UTC m=+5773.679821407" watchObservedRunningTime="2025-10-02 12:46:13.131325687 +0000 UTC m=+5773.681692051"
Oct 02 12:46:14 crc kubenswrapper[4929]: I1002 12:46:14.736861 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:46:14 crc kubenswrapper[4929]: I1002 12:46:14.736926 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:46:14 crc kubenswrapper[4929]: I1002 12:46:14.736997 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 12:46:14 crc kubenswrapper[4929]: I1002 12:46:14.737679 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd3f3300044292572692a4205fb0d2be0b602520d522a4e0786217e15a1c757a"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:46:14 crc kubenswrapper[4929]: I1002 12:46:14.737749 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://fd3f3300044292572692a4205fb0d2be0b602520d522a4e0786217e15a1c757a" gracePeriod=600
Oct 02 12:46:15 crc kubenswrapper[4929]: I1002 12:46:15.129146 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="fd3f3300044292572692a4205fb0d2be0b602520d522a4e0786217e15a1c757a" exitCode=0
Oct 02 12:46:15 crc kubenswrapper[4929]: I1002 12:46:15.129222 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"fd3f3300044292572692a4205fb0d2be0b602520d522a4e0786217e15a1c757a"}
Oct 02 12:46:15 crc kubenswrapper[4929]: I1002 12:46:15.129537 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0"}
Oct 02 12:46:15 crc kubenswrapper[4929]: I1002 12:46:15.129563 4929 scope.go:117] "RemoveContainer" containerID="9f8cb5f0eea4de10c2e67cef8058e74230335385885ca8e1591f4bb3de1109cf"
Oct 02 12:46:15 crc kubenswrapper[4929]: I1002 12:46:15.131596 4929 generic.go:334] "Generic (PLEG): container finished" podID="62690ece-7ef3-4ada-9ff4-9ed1d858fea6" containerID="f4d119c55679d5eb748d901a878c0c350f47bed375a2a3f5be759186bfdcbf15" exitCode=0
Oct 02 12:46:15 crc kubenswrapper[4929]: I1002 12:46:15.131631 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bgxnp" event={"ID":"62690ece-7ef3-4ada-9ff4-9ed1d858fea6","Type":"ContainerDied","Data":"f4d119c55679d5eb748d901a878c0c350f47bed375a2a3f5be759186bfdcbf15"}
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.472106 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.632754 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-combined-ca-bundle\") pod \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") "
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.632823 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg9c9\" (UniqueName: \"kubernetes.io/projected/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-kube-api-access-bg9c9\") pod \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") "
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.632879 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-etc-machine-id\") pod \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") "
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.632946 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-scripts\") pod \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") "
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.633091 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-db-sync-config-data\") pod \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") "
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.633179 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-config-data\") pod \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\" (UID: \"62690ece-7ef3-4ada-9ff4-9ed1d858fea6\") "
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.634131 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "62690ece-7ef3-4ada-9ff4-9ed1d858fea6" (UID: "62690ece-7ef3-4ada-9ff4-9ed1d858fea6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.639281 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-scripts" (OuterVolumeSpecName: "scripts") pod "62690ece-7ef3-4ada-9ff4-9ed1d858fea6" (UID: "62690ece-7ef3-4ada-9ff4-9ed1d858fea6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.643072 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "62690ece-7ef3-4ada-9ff4-9ed1d858fea6" (UID: "62690ece-7ef3-4ada-9ff4-9ed1d858fea6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.643231 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-kube-api-access-bg9c9" (OuterVolumeSpecName: "kube-api-access-bg9c9") pod "62690ece-7ef3-4ada-9ff4-9ed1d858fea6" (UID: "62690ece-7ef3-4ada-9ff4-9ed1d858fea6"). InnerVolumeSpecName "kube-api-access-bg9c9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.666056 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62690ece-7ef3-4ada-9ff4-9ed1d858fea6" (UID: "62690ece-7ef3-4ada-9ff4-9ed1d858fea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.690417 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-config-data" (OuterVolumeSpecName: "config-data") pod "62690ece-7ef3-4ada-9ff4-9ed1d858fea6" (UID: "62690ece-7ef3-4ada-9ff4-9ed1d858fea6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.735162 4929 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.735201 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.735212 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.735222 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg9c9\" (UniqueName: \"kubernetes.io/projected/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-kube-api-access-bg9c9\") on node \"crc\" DevicePath \"\""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.735234 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 02 12:46:16 crc kubenswrapper[4929]: I1002 12:46:16.735242 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62690ece-7ef3-4ada-9ff4-9ed1d858fea6-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.156624 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-bgxnp" event={"ID":"62690ece-7ef3-4ada-9ff4-9ed1d858fea6","Type":"ContainerDied","Data":"67c40db2c7e96442352f02b71a3b6f470356a79b06bd0cd4cc2c36337427cae8"}
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.156657 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67c40db2c7e96442352f02b71a3b6f470356a79b06bd0cd4cc2c36337427cae8"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.156702 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-bgxnp"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.544455 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845dd45477-hpv7l"]
Oct 02 12:46:17 crc kubenswrapper[4929]: E1002 12:46:17.545027 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62690ece-7ef3-4ada-9ff4-9ed1d858fea6" containerName="cinder-db-sync"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.545044 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="62690ece-7ef3-4ada-9ff4-9ed1d858fea6" containerName="cinder-db-sync"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.545308 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="62690ece-7ef3-4ada-9ff4-9ed1d858fea6" containerName="cinder-db-sync"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.546606 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.577064 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845dd45477-hpv7l"]
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.659193 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-sb\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.659473 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-dns-svc\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.659552 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-config\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.659656 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-nb\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.659757 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7pzv\" (UniqueName: \"kubernetes.io/projected/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-kube-api-access-c7pzv\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.761573 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.762066 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-dns-svc\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.762120 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-config\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.762203 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-nb\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.762277 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7pzv\" (UniqueName: \"kubernetes.io/projected/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-kube-api-access-c7pzv\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.762351 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-sb\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.763148 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-config\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.763221 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-sb\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.763526 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.763907 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-dns-svc\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.764268 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-nb\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.769341 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fvxs6"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.769673 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.769772 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.769711 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.781739 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.813690 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7pzv\" (UniqueName: \"kubernetes.io/projected/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-kube-api-access-c7pzv\") pod \"dnsmasq-dns-845dd45477-hpv7l\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") " pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.866395 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.867490 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-scripts\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.867525 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d22557-1d53-4138-93c7-662c89983eda-logs\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.867588 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.867629 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.867675 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9d22557-1d53-4138-93c7-662c89983eda-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.867696 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.867738 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q998\" (UniqueName: \"kubernetes.io/projected/d9d22557-1d53-4138-93c7-662c89983eda-kube-api-access-5q998\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.969819 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.969914 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9d22557-1d53-4138-93c7-662c89983eda-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.969942 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.970006 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q998\" (UniqueName: \"kubernetes.io/projected/d9d22557-1d53-4138-93c7-662c89983eda-kube-api-access-5q998\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.970066 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-scripts\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.970086 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d22557-1d53-4138-93c7-662c89983eda-logs\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.970155 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.971379 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d22557-1d53-4138-93c7-662c89983eda-logs\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.971755 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9d22557-1d53-4138-93c7-662c89983eda-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.981672 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.987277 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.987835 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-scripts\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.988592 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:17 crc kubenswrapper[4929]: I1002 12:46:17.995663 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q998\" (UniqueName: \"kubernetes.io/projected/d9d22557-1d53-4138-93c7-662c89983eda-kube-api-access-5q998\") pod \"cinder-api-0\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " pod="openstack/cinder-api-0"
Oct 02 12:46:18 crc kubenswrapper[4929]: I1002 12:46:18.090382 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 02 12:46:18 crc kubenswrapper[4929]: W1002 12:46:18.395843 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda583de89_e9f5_48d5_b1d4_b0fb2e53b992.slice/crio-d2d093ca8427026931bf72dafa0ce7b90a79ad664cb8c9989e8b0704a6099354 WatchSource:0}: Error finding container d2d093ca8427026931bf72dafa0ce7b90a79ad664cb8c9989e8b0704a6099354: Status 404 returned error can't find the container with id d2d093ca8427026931bf72dafa0ce7b90a79ad664cb8c9989e8b0704a6099354
Oct 02 12:46:18 crc kubenswrapper[4929]: I1002 12:46:18.397841 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845dd45477-hpv7l"]
Oct 02 12:46:18 crc kubenswrapper[4929]: I1002 12:46:18.547270 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 12:46:19 crc kubenswrapper[4929]: I1002 12:46:19.248467 4929 generic.go:334] "Generic (PLEG): container finished" podID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerID="408a8aeac3418e7e6599d90c9c52646e1a0f45146765e2257eab73034d547c02" exitCode=0
Oct 02 12:46:19 crc kubenswrapper[4929]: I1002 12:46:19.248534 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" event={"ID":"a583de89-e9f5-48d5-b1d4-b0fb2e53b992","Type":"ContainerDied","Data":"408a8aeac3418e7e6599d90c9c52646e1a0f45146765e2257eab73034d547c02"}
Oct 02 12:46:19 crc kubenswrapper[4929]: I1002 12:46:19.249121 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" event={"ID":"a583de89-e9f5-48d5-b1d4-b0fb2e53b992","Type":"ContainerStarted","Data":"d2d093ca8427026931bf72dafa0ce7b90a79ad664cb8c9989e8b0704a6099354"}
Oct 02 12:46:19 crc kubenswrapper[4929]: I1002 12:46:19.253731 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9d22557-1d53-4138-93c7-662c89983eda","Type":"ContainerStarted","Data":"6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240"}
Oct 02 12:46:19 crc kubenswrapper[4929]: I1002 12:46:19.254078 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9d22557-1d53-4138-93c7-662c89983eda","Type":"ContainerStarted","Data":"489425f018f2cc6aa38fd5ed07454b25b1d947134f9e7f0a42d84b9e620da5fc"}
Oct 02 12:46:20 crc kubenswrapper[4929]: I1002 12:46:20.263205 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" event={"ID":"a583de89-e9f5-48d5-b1d4-b0fb2e53b992","Type":"ContainerStarted","Data":"e45b9445689c4fcf44fd21aee8563add036056f6a912455e33afbb42cbe11d08"}
Oct 02 12:46:20 crc kubenswrapper[4929]: I1002 12:46:20.263494 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:46:20 crc
kubenswrapper[4929]: I1002 12:46:20.301357 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" podStartSLOduration=3.301337819 podStartE2EDuration="3.301337819s" podCreationTimestamp="2025-10-02 12:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:20.283261078 +0000 UTC m=+5780.833627462" watchObservedRunningTime="2025-10-02 12:46:20.301337819 +0000 UTC m=+5780.851704183" Oct 02 12:46:21 crc kubenswrapper[4929]: I1002 12:46:21.272196 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9d22557-1d53-4138-93c7-662c89983eda","Type":"ContainerStarted","Data":"45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c"} Oct 02 12:46:21 crc kubenswrapper[4929]: I1002 12:46:21.272566 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 12:46:21 crc kubenswrapper[4929]: I1002 12:46:21.294712 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.294688719 podStartE2EDuration="4.294688719s" podCreationTimestamp="2025-10-02 12:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:21.286542274 +0000 UTC m=+5781.836908638" watchObservedRunningTime="2025-10-02 12:46:21.294688719 +0000 UTC m=+5781.845055083" Oct 02 12:46:27 crc kubenswrapper[4929]: I1002 12:46:27.868783 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" Oct 02 12:46:27 crc kubenswrapper[4929]: I1002 12:46:27.996946 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759fbf6bbf-q8jqx"] Oct 02 12:46:27 crc kubenswrapper[4929]: I1002 12:46:27.997479 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" podUID="d1f53255-170f-4bcd-be6c-9b48995ab571" containerName="dnsmasq-dns" containerID="cri-o://a7914aba29d9d75a4cee3239e493b4ffd797bc451af1714c92534269b4f59d3c" gracePeriod=10 Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.350103 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1f53255-170f-4bcd-be6c-9b48995ab571" containerID="a7914aba29d9d75a4cee3239e493b4ffd797bc451af1714c92534269b4f59d3c" exitCode=0 Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.350464 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" event={"ID":"d1f53255-170f-4bcd-be6c-9b48995ab571","Type":"ContainerDied","Data":"a7914aba29d9d75a4cee3239e493b4ffd797bc451af1714c92534269b4f59d3c"} Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.491286 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.601935 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-config\") pod \"d1f53255-170f-4bcd-be6c-9b48995ab571\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.602109 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-nb\") pod \"d1f53255-170f-4bcd-be6c-9b48995ab571\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.602153 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-sb\") pod \"d1f53255-170f-4bcd-be6c-9b48995ab571\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.602238 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hwsz\" (UniqueName: \"kubernetes.io/projected/d1f53255-170f-4bcd-be6c-9b48995ab571-kube-api-access-7hwsz\") pod \"d1f53255-170f-4bcd-be6c-9b48995ab571\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.602311 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-dns-svc\") pod \"d1f53255-170f-4bcd-be6c-9b48995ab571\" (UID: \"d1f53255-170f-4bcd-be6c-9b48995ab571\") " Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.620274 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f53255-170f-4bcd-be6c-9b48995ab571-kube-api-access-7hwsz" (OuterVolumeSpecName: "kube-api-access-7hwsz") pod "d1f53255-170f-4bcd-be6c-9b48995ab571" (UID: "d1f53255-170f-4bcd-be6c-9b48995ab571"). InnerVolumeSpecName "kube-api-access-7hwsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.651111 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d1f53255-170f-4bcd-be6c-9b48995ab571" (UID: "d1f53255-170f-4bcd-be6c-9b48995ab571"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.651471 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-config" (OuterVolumeSpecName: "config") pod "d1f53255-170f-4bcd-be6c-9b48995ab571" (UID: "d1f53255-170f-4bcd-be6c-9b48995ab571"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.659376 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d1f53255-170f-4bcd-be6c-9b48995ab571" (UID: "d1f53255-170f-4bcd-be6c-9b48995ab571"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.672726 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1f53255-170f-4bcd-be6c-9b48995ab571" (UID: "d1f53255-170f-4bcd-be6c-9b48995ab571"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.704279 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hwsz\" (UniqueName: \"kubernetes.io/projected/d1f53255-170f-4bcd-be6c-9b48995ab571-kube-api-access-7hwsz\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.704317 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.704326 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.704337 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:28 crc kubenswrapper[4929]: I1002 12:46:28.704345 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d1f53255-170f-4bcd-be6c-9b48995ab571-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.362739 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" event={"ID":"d1f53255-170f-4bcd-be6c-9b48995ab571","Type":"ContainerDied","Data":"94f93174b702a4cf054abe57d1c5417102845dc33993b96d4b2e7e56168dbe1c"} Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.363135 4929 scope.go:117] "RemoveContainer" containerID="a7914aba29d9d75a4cee3239e493b4ffd797bc451af1714c92534269b4f59d3c" Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.363380 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759fbf6bbf-q8jqx" Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.387091 4929 scope.go:117] "RemoveContainer" containerID="18b0590ac5a308d3d430b9a0be30e2c95b3f0341050860c9f0d92afd83d6c2d4" Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.420794 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759fbf6bbf-q8jqx"] Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.432003 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759fbf6bbf-q8jqx"] Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.867011 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.867283 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-log" containerID="cri-o://7f9b67e5d7fce2e8c21ab8c05c80bf37b31fa8a4ddcdcccfa6ed95c0fbe1c226" gracePeriod=30 Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.867439 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-metadata" containerID="cri-o://ff3905119011c6ae9f0adbd8884839280509cab73d15f363fc97a964b1269082" gracePeriod=30 Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.891934 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.892195 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2e0cbaa7-a006-494f-baa9-a13306e8edb4" containerName="nova-scheduler-scheduler" containerID="cri-o://2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3" gracePeriod=30 Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.906681 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.906904 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="1fbd7f42-08ae-4572-96a1-0b74c7a63866" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c9374fb251f110b66c563d08602803d3c0a1fde424b53358990a6636c8d1e2f9" gracePeriod=30 Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.920055 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.920565 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="c5cd9082-d45d-4842-8c60-b75330997f59" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://964670ac51fc2bc9a5c92e8b605ecb92c96615496d64726fcd06a56bc90de38c" gracePeriod=30 Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.930845 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 12:46:29.931084 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-log" containerID="cri-o://a1f761566b9cf0369910eefb317521aee955169d792d415a49d58e4e3724d248" gracePeriod=30 Oct 02 12:46:29 crc kubenswrapper[4929]: I1002 
12:46:29.931240 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-api" containerID="cri-o://84380993db879437c54629cc1e449cc7687f9263ae86ca84c03ed10446158707" gracePeriod=30 Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.068465 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.192724 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f53255-170f-4bcd-be6c-9b48995ab571" path="/var/lib/kubelet/pods/d1f53255-170f-4bcd-be6c-9b48995ab571/volumes" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.375097 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c5cd9082-d45d-4842-8c60-b75330997f59","Type":"ContainerDied","Data":"964670ac51fc2bc9a5c92e8b605ecb92c96615496d64726fcd06a56bc90de38c"} Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.375161 4929 generic.go:334] "Generic (PLEG): container finished" podID="c5cd9082-d45d-4842-8c60-b75330997f59" containerID="964670ac51fc2bc9a5c92e8b605ecb92c96615496d64726fcd06a56bc90de38c" exitCode=0 Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.379030 4929 generic.go:334] "Generic (PLEG): container finished" podID="db37045c-cb4a-45f1-b530-a0da6442becd" containerID="7f9b67e5d7fce2e8c21ab8c05c80bf37b31fa8a4ddcdcccfa6ed95c0fbe1c226" exitCode=143 Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.379073 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db37045c-cb4a-45f1-b530-a0da6442becd","Type":"ContainerDied","Data":"7f9b67e5d7fce2e8c21ab8c05c80bf37b31fa8a4ddcdcccfa6ed95c0fbe1c226"} Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.385392 4929 generic.go:334] "Generic (PLEG): container finished" podID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerID="a1f761566b9cf0369910eefb317521aee955169d792d415a49d58e4e3724d248" exitCode=143 Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.385451 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e86650-6dc6-42d4-9c1d-879f111ff9b2","Type":"ContainerDied","Data":"a1f761566b9cf0369910eefb317521aee955169d792d415a49d58e4e3724d248"} Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.740585 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.892275 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66zw7\" (UniqueName: \"kubernetes.io/projected/c5cd9082-d45d-4842-8c60-b75330997f59-kube-api-access-66zw7\") pod \"c5cd9082-d45d-4842-8c60-b75330997f59\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.892530 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-config-data\") pod \"c5cd9082-d45d-4842-8c60-b75330997f59\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.892584 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-combined-ca-bundle\") pod \"c5cd9082-d45d-4842-8c60-b75330997f59\" (UID: \"c5cd9082-d45d-4842-8c60-b75330997f59\") " Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.913180 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cd9082-d45d-4842-8c60-b75330997f59-kube-api-access-66zw7" (OuterVolumeSpecName: "kube-api-access-66zw7") pod "c5cd9082-d45d-4842-8c60-b75330997f59" (UID: "c5cd9082-d45d-4842-8c60-b75330997f59"). InnerVolumeSpecName "kube-api-access-66zw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.922214 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-config-data" (OuterVolumeSpecName: "config-data") pod "c5cd9082-d45d-4842-8c60-b75330997f59" (UID: "c5cd9082-d45d-4842-8c60-b75330997f59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.930093 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5cd9082-d45d-4842-8c60-b75330997f59" (UID: "c5cd9082-d45d-4842-8c60-b75330997f59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.995098 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.995135 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66zw7\" (UniqueName: \"kubernetes.io/projected/c5cd9082-d45d-4842-8c60-b75330997f59-kube-api-access-66zw7\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:30 crc kubenswrapper[4929]: I1002 12:46:30.995146 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5cd9082-d45d-4842-8c60-b75330997f59-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.396846 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c5cd9082-d45d-4842-8c60-b75330997f59","Type":"ContainerDied","Data":"6e215cb18e45ccdd5b5dd555e72aad56b9a0a38eab3d5de3eac83b50ad85893f"} Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.396901 4929 scope.go:117] "RemoveContainer" containerID="964670ac51fc2bc9a5c92e8b605ecb92c96615496d64726fcd06a56bc90de38c" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.397043 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.399747 4929 generic.go:334] "Generic (PLEG): container finished" podID="1fbd7f42-08ae-4572-96a1-0b74c7a63866" containerID="c9374fb251f110b66c563d08602803d3c0a1fde424b53358990a6636c8d1e2f9" exitCode=0 Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.399786 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1fbd7f42-08ae-4572-96a1-0b74c7a63866","Type":"ContainerDied","Data":"c9374fb251f110b66c563d08602803d3c0a1fde424b53358990a6636c8d1e2f9"} Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.478643 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.482394 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.501301 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:46:31 crc kubenswrapper[4929]: E1002 12:46:31.523468 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cd9082-d45d-4842-8c60-b75330997f59" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.523506 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cd9082-d45d-4842-8c60-b75330997f59" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 12:46:31 crc kubenswrapper[4929]: E1002 12:46:31.523523 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f53255-170f-4bcd-be6c-9b48995ab571" containerName="init" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.523530 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f53255-170f-4bcd-be6c-9b48995ab571" containerName="init" Oct 02 12:46:31 crc kubenswrapper[4929]: E1002 12:46:31.523553 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f53255-170f-4bcd-be6c-9b48995ab571" 
containerName="dnsmasq-dns" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.523561 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f53255-170f-4bcd-be6c-9b48995ab571" containerName="dnsmasq-dns" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.524085 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f53255-170f-4bcd-be6c-9b48995ab571" containerName="dnsmasq-dns" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.524140 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cd9082-d45d-4842-8c60-b75330997f59" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.525073 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.525171 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.536157 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.615356 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68107bd7-f800-4101-963b-24050976ae71-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.615416 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jct8\" (UniqueName: \"kubernetes.io/projected/68107bd7-f800-4101-963b-24050976ae71-kube-api-access-4jct8\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.615489 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68107bd7-f800-4101-963b-24050976ae71-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.677268 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.718553 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68107bd7-f800-4101-963b-24050976ae71-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.720018 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68107bd7-f800-4101-963b-24050976ae71-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.721189 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jct8\" (UniqueName: \"kubernetes.io/projected/68107bd7-f800-4101-963b-24050976ae71-kube-api-access-4jct8\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.725063 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68107bd7-f800-4101-963b-24050976ae71-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.742891 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jct8\" (UniqueName: \"kubernetes.io/projected/68107bd7-f800-4101-963b-24050976ae71-kube-api-access-4jct8\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.743886 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68107bd7-f800-4101-963b-24050976ae71-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"68107bd7-f800-4101-963b-24050976ae71\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.823368 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-combined-ca-bundle\") pod \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.824016 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-config-data\") pod \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.824160 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572fb\" (UniqueName: \"kubernetes.io/projected/1fbd7f42-08ae-4572-96a1-0b74c7a63866-kube-api-access-572fb\") pod \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\" (UID: \"1fbd7f42-08ae-4572-96a1-0b74c7a63866\") " Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.828905 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1fbd7f42-08ae-4572-96a1-0b74c7a63866-kube-api-access-572fb" (OuterVolumeSpecName: "kube-api-access-572fb") pod "1fbd7f42-08ae-4572-96a1-0b74c7a63866" (UID: "1fbd7f42-08ae-4572-96a1-0b74c7a63866"). InnerVolumeSpecName "kube-api-access-572fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.834773 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.855087 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-config-data" (OuterVolumeSpecName: "config-data") pod "1fbd7f42-08ae-4572-96a1-0b74c7a63866" (UID: "1fbd7f42-08ae-4572-96a1-0b74c7a63866"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.862504 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fbd7f42-08ae-4572-96a1-0b74c7a63866" (UID: "1fbd7f42-08ae-4572-96a1-0b74c7a63866"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.925743 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-combined-ca-bundle\") pod \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.925847 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-config-data\") pod \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.925885 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghn6d\" (UniqueName: \"kubernetes.io/projected/2e0cbaa7-a006-494f-baa9-a13306e8edb4-kube-api-access-ghn6d\") pod \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\" (UID: \"2e0cbaa7-a006-494f-baa9-a13306e8edb4\") " Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.926348 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.926368 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbd7f42-08ae-4572-96a1-0b74c7a63866-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.926421 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-572fb\" (UniqueName: \"kubernetes.io/projected/1fbd7f42-08ae-4572-96a1-0b74c7a63866-kube-api-access-572fb\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.928674 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0cbaa7-a006-494f-baa9-a13306e8edb4-kube-api-access-ghn6d" (OuterVolumeSpecName: "kube-api-access-ghn6d") pod 
"2e0cbaa7-a006-494f-baa9-a13306e8edb4" (UID: "2e0cbaa7-a006-494f-baa9-a13306e8edb4"). InnerVolumeSpecName "kube-api-access-ghn6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.950393 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e0cbaa7-a006-494f-baa9-a13306e8edb4" (UID: "2e0cbaa7-a006-494f-baa9-a13306e8edb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.954109 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-config-data" (OuterVolumeSpecName: "config-data") pod "2e0cbaa7-a006-494f-baa9-a13306e8edb4" (UID: "2e0cbaa7-a006-494f-baa9-a13306e8edb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:31 crc kubenswrapper[4929]: I1002 12:46:31.972819 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.028414 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghn6d\" (UniqueName: \"kubernetes.io/projected/2e0cbaa7-a006-494f-baa9-a13306e8edb4-kube-api-access-ghn6d\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.028460 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.028471 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e0cbaa7-a006-494f-baa9-a13306e8edb4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.176406 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cd9082-d45d-4842-8c60-b75330997f59" path="/var/lib/kubelet/pods/c5cd9082-d45d-4842-8c60-b75330997f59/volumes" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.416851 4929 generic.go:334] "Generic (PLEG): container finished" podID="2e0cbaa7-a006-494f-baa9-a13306e8edb4" containerID="2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3" exitCode=0 Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.417340 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.417485 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e0cbaa7-a006-494f-baa9-a13306e8edb4","Type":"ContainerDied","Data":"2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3"} Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.417534 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e0cbaa7-a006-494f-baa9-a13306e8edb4","Type":"ContainerDied","Data":"88aaac89aed59628719b23c004f626be01c2dd163e668df55c196021d9fa2717"} Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.417554 4929 scope.go:117] "RemoveContainer" containerID="2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.423609 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1fbd7f42-08ae-4572-96a1-0b74c7a63866","Type":"ContainerDied","Data":"c60eed098c69f08c70d7bd502267a34fc6203bbf01f6a976c9b377990ea97033"} Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.423719 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.467244 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.485272 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.489387 4929 scope.go:117] "RemoveContainer" containerID="2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3" Oct 02 12:46:32 crc kubenswrapper[4929]: E1002 12:46:32.490006 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3\": container with ID starting with 2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3 not found: ID does not exist" containerID="2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.490056 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3"} err="failed to get container status \"2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3\": rpc error: code = NotFound desc = could not find container \"2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3\": container with ID starting with 2abb75887da7396748c177c8e45e967d2af166a66658bf4c276dd5d5bdb0e0f3 not found: ID does not exist" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.490090 4929 scope.go:117] "RemoveContainer" containerID="c9374fb251f110b66c563d08602803d3c0a1fde424b53358990a6636c8d1e2f9" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.506013 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.522912 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.544137 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:46:32 crc 
kubenswrapper[4929]: I1002 12:46:32.559826 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: E1002 12:46:32.560603 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbd7f42-08ae-4572-96a1-0b74c7a63866" containerName="nova-cell0-conductor-conductor" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.560627 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbd7f42-08ae-4572-96a1-0b74c7a63866" containerName="nova-cell0-conductor-conductor" Oct 02 12:46:32 crc kubenswrapper[4929]: E1002 12:46:32.560683 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0cbaa7-a006-494f-baa9-a13306e8edb4" containerName="nova-scheduler-scheduler" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.560693 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0cbaa7-a006-494f-baa9-a13306e8edb4" containerName="nova-scheduler-scheduler" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.560981 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fbd7f42-08ae-4572-96a1-0b74c7a63866" containerName="nova-cell0-conductor-conductor" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.561012 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0cbaa7-a006-494f-baa9-a13306e8edb4" containerName="nova-scheduler-scheduler" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.563121 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.565762 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.569033 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.570992 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.572343 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.578911 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.588102 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.645087 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.645195 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6jsg\" (UniqueName: \"kubernetes.io/projected/a2eabde0-bfe0-456f-a226-8eb41402dd41-kube-api-access-z6jsg\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.645228 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.747347 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6jsg\" (UniqueName: \"kubernetes.io/projected/a2eabde0-bfe0-456f-a226-8eb41402dd41-kube-api-access-z6jsg\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.748521 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.750051 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-config-data\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.750097 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.750468 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.750517 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl4kv\" (UniqueName: \"kubernetes.io/projected/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-kube-api-access-xl4kv\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.755319 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.755634 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.763866 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6jsg\" (UniqueName: \"kubernetes.io/projected/a2eabde0-bfe0-456f-a226-8eb41402dd41-kube-api-access-z6jsg\") pod \"nova-cell0-conductor-0\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.852635 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-config-data\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.852679 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.852760 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl4kv\" (UniqueName: \"kubernetes.io/projected/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-kube-api-access-xl4kv\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.857759 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.857888 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-config-data\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.870939 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xl4kv\" (UniqueName: \"kubernetes.io/projected/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-kube-api-access-xl4kv\") pod \"nova-scheduler-0\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " pod="openstack/nova-scheduler-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.969192 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:32 crc kubenswrapper[4929]: I1002 12:46:32.992244 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.007850 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": read tcp 10.217.0.2:40272->10.217.1.73:8775: read: connection reset by peer" Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.008181 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": read tcp 10.217.0.2:40270->10.217.1.73:8775: read: connection reset by peer" Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.151884 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.152450 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e94be396-9a29-47d2-9f2a-ff9ab7a08605" containerName="nova-cell1-conductor-conductor" containerID="cri-o://703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa" gracePeriod=30 Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.464122 4929 generic.go:334] "Generic (PLEG): container finished" podID="db37045c-cb4a-45f1-b530-a0da6442becd" containerID="ff3905119011c6ae9f0adbd8884839280509cab73d15f363fc97a964b1269082" exitCode=0 Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.464213 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db37045c-cb4a-45f1-b530-a0da6442becd","Type":"ContainerDied","Data":"ff3905119011c6ae9f0adbd8884839280509cab73d15f363fc97a964b1269082"} Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.466251 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"68107bd7-f800-4101-963b-24050976ae71","Type":"ContainerStarted","Data":"257de1f59e5859796d7948bed31efcd00008294417a1e3364ec74660999298a9"} Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.466301 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"68107bd7-f800-4101-963b-24050976ae71","Type":"ContainerStarted","Data":"56641f751e0303cd8bdcffb42b900690dc98f98f5cbf5ab4cf0170846212c8c9"} Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.470787 4929 generic.go:334] "Generic (PLEG): container finished" podID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerID="84380993db879437c54629cc1e449cc7687f9263ae86ca84c03ed10446158707" exitCode=0 Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.470834 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e3e86650-6dc6-42d4-9c1d-879f111ff9b2","Type":"ContainerDied","Data":"84380993db879437c54629cc1e449cc7687f9263ae86ca84c03ed10446158707"} Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.488337 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.488315658 podStartE2EDuration="2.488315658s" podCreationTimestamp="2025-10-02 12:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:33.485109566 +0000 UTC m=+5794.035475930" watchObservedRunningTime="2025-10-02 12:46:33.488315658 +0000 UTC m=+5794.038682022" Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.765530 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:46:33 crc kubenswrapper[4929]: W1002 12:46:33.772781 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e8b15f_d208_40d8_884c_3a80b25bbfcb.slice/crio-789890918cc796bfc8d773cbb8288b01e6c832e7a4a66ce1bb0f261690d63d50 WatchSource:0}: Error finding container 789890918cc796bfc8d773cbb8288b01e6c832e7a4a66ce1bb0f261690d63d50: Status 404 returned error can't find the container with id 789890918cc796bfc8d773cbb8288b01e6c832e7a4a66ce1bb0f261690d63d50 Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.780906 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:46:33 crc kubenswrapper[4929]: W1002 12:46:33.792121 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2eabde0_bfe0_456f_a226_8eb41402dd41.slice/crio-413c6c9b67f96226467a77d33005540529da63f095758d7ab567973179bad380 WatchSource:0}: Error finding container 413c6c9b67f96226467a77d33005540529da63f095758d7ab567973179bad380: Status 404 returned error can't find the container with id 413c6c9b67f96226467a77d33005540529da63f095758d7ab567973179bad380 Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.897337 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:46:33 crc kubenswrapper[4929]: I1002 12:46:33.930135 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010159 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-logs\") pod \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010246 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-config-data\") pod \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010283 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-combined-ca-bundle\") pod \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010333 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24k2g\" (UniqueName: \"kubernetes.io/projected/db37045c-cb4a-45f1-b530-a0da6442becd-kube-api-access-24k2g\") pod \"db37045c-cb4a-45f1-b530-a0da6442becd\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010545 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db37045c-cb4a-45f1-b530-a0da6442becd-logs\") pod \"db37045c-cb4a-45f1-b530-a0da6442becd\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010602 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-config-data\") pod \"db37045c-cb4a-45f1-b530-a0da6442becd\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010653 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92rpr\" (UniqueName: \"kubernetes.io/projected/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-kube-api-access-92rpr\") pod \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\" (UID: \"e3e86650-6dc6-42d4-9c1d-879f111ff9b2\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.010783 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-combined-ca-bundle\") pod \"db37045c-cb4a-45f1-b530-a0da6442becd\" (UID: \"db37045c-cb4a-45f1-b530-a0da6442becd\") " Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.012330 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db37045c-cb4a-45f1-b530-a0da6442becd-logs" (OuterVolumeSpecName: "logs") pod "db37045c-cb4a-45f1-b530-a0da6442becd" (UID: "db37045c-cb4a-45f1-b530-a0da6442becd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.012311 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-logs" (OuterVolumeSpecName: "logs") pod "e3e86650-6dc6-42d4-9c1d-879f111ff9b2" (UID: "e3e86650-6dc6-42d4-9c1d-879f111ff9b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.023675 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db37045c-cb4a-45f1-b530-a0da6442becd-kube-api-access-24k2g" (OuterVolumeSpecName: "kube-api-access-24k2g") pod "db37045c-cb4a-45f1-b530-a0da6442becd" (UID: "db37045c-cb4a-45f1-b530-a0da6442becd"). InnerVolumeSpecName "kube-api-access-24k2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.042029 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-kube-api-access-92rpr" (OuterVolumeSpecName: "kube-api-access-92rpr") pod "e3e86650-6dc6-42d4-9c1d-879f111ff9b2" (UID: "e3e86650-6dc6-42d4-9c1d-879f111ff9b2"). InnerVolumeSpecName "kube-api-access-92rpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.075159 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3e86650-6dc6-42d4-9c1d-879f111ff9b2" (UID: "e3e86650-6dc6-42d4-9c1d-879f111ff9b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.076214 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-config-data" (OuterVolumeSpecName: "config-data") pod "db37045c-cb4a-45f1-b530-a0da6442becd" (UID: "db37045c-cb4a-45f1-b530-a0da6442becd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.076805 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db37045c-cb4a-45f1-b530-a0da6442becd" (UID: "db37045c-cb4a-45f1-b530-a0da6442becd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.086101 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-config-data" (OuterVolumeSpecName: "config-data") pod "e3e86650-6dc6-42d4-9c1d-879f111ff9b2" (UID: "e3e86650-6dc6-42d4-9c1d-879f111ff9b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113055 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db37045c-cb4a-45f1-b530-a0da6442becd-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113304 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113383 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92rpr\" (UniqueName: \"kubernetes.io/projected/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-kube-api-access-92rpr\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113461 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db37045c-cb4a-45f1-b530-a0da6442becd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113575 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113661 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113742 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e86650-6dc6-42d4-9c1d-879f111ff9b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.113822 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24k2g\" (UniqueName: \"kubernetes.io/projected/db37045c-cb4a-45f1-b530-a0da6442becd-kube-api-access-24k2g\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.172247 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fbd7f42-08ae-4572-96a1-0b74c7a63866" path="/var/lib/kubelet/pods/1fbd7f42-08ae-4572-96a1-0b74c7a63866/volumes" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.172901 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0cbaa7-a006-494f-baa9-a13306e8edb4" path="/var/lib/kubelet/pods/2e0cbaa7-a006-494f-baa9-a13306e8edb4/volumes" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.485437 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a2eabde0-bfe0-456f-a226-8eb41402dd41","Type":"ContainerStarted","Data":"aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e"} Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.485781 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a2eabde0-bfe0-456f-a226-8eb41402dd41","Type":"ContainerStarted","Data":"413c6c9b67f96226467a77d33005540529da63f095758d7ab567973179bad380"} Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.487291 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.491654 4929 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db37045c-cb4a-45f1-b530-a0da6442becd","Type":"ContainerDied","Data":"c0470a5ac14e8fa175d7ed2f8c4cc24371020ef26dfd9d63325aa4a8c11512b3"} Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.491738 4929 scope.go:117] "RemoveContainer" containerID="ff3905119011c6ae9f0adbd8884839280509cab73d15f363fc97a964b1269082" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.491798 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.497262 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1e8b15f-d208-40d8-884c-3a80b25bbfcb","Type":"ContainerStarted","Data":"bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813"} Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.497309 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1e8b15f-d208-40d8-884c-3a80b25bbfcb","Type":"ContainerStarted","Data":"789890918cc796bfc8d773cbb8288b01e6c832e7a4a66ce1bb0f261690d63d50"} Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.500567 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.501107 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e86650-6dc6-42d4-9c1d-879f111ff9b2","Type":"ContainerDied","Data":"d396728734ac862ab3ab7da2365b8c8e455f57d1356dfa604d03c7e04292c137"} Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.518699 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.518681115 podStartE2EDuration="2.518681115s" podCreationTimestamp="2025-10-02 12:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:34.509313054 +0000 UTC m=+5795.059679438" watchObservedRunningTime="2025-10-02 12:46:34.518681115 +0000 UTC m=+5795.069047479" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.558030 4929 scope.go:117] "RemoveContainer" containerID="7f9b67e5d7fce2e8c21ab8c05c80bf37b31fa8a4ddcdcccfa6ed95c0fbe1c226" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.567796 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.584500 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.596291 4929 scope.go:117] "RemoveContainer" containerID="84380993db879437c54629cc1e449cc7687f9263ae86ca84c03ed10446158707" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.602304 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: E1002 12:46:34.602771 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-metadata" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.602795 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-metadata" Oct 02 12:46:34 crc kubenswrapper[4929]: E1002 12:46:34.602824 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-log" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.602832 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-log" Oct 02 12:46:34 crc kubenswrapper[4929]: E1002 12:46:34.602845 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-api" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.602853 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-api" Oct 02 12:46:34 crc kubenswrapper[4929]: E1002 12:46:34.602872 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-log" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.602880 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-log" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.603064 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-log" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.603076 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-api" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.603096 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" containerName="nova-api-log" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.603106 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" containerName="nova-metadata-metadata" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.604204 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.608794 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.617594 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.627469 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.627846 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-config-data\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.628036 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-logs\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.628105 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22cwq\" (UniqueName: \"kubernetes.io/projected/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-kube-api-access-22cwq\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.645520 4929 scope.go:117] "RemoveContainer" containerID="a1f761566b9cf0369910eefb317521aee955169d792d415a49d58e4e3724d248" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.646447 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.646422519 podStartE2EDuration="2.646422519s" podCreationTimestamp="2025-10-02 12:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:34.579993883 +0000 UTC m=+5795.130360257" watchObservedRunningTime="2025-10-02 12:46:34.646422519 +0000 UTC m=+5795.196788883" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.659471 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.675031 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.681014 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.684137 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.687985 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.692100 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.732866 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-logs\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.732939 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22cwq\" (UniqueName: \"kubernetes.io/projected/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-kube-api-access-22cwq\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.733003 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e885b6-4a14-4f86-9706-4c4df78b2704-logs\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.733048 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clpsr\" (UniqueName: \"kubernetes.io/projected/56e885b6-4a14-4f86-9706-4c4df78b2704-kube-api-access-clpsr\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.733076 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-config-data\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.733109 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.733147 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.733205 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-config-data\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.735656 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-logs\") pod \"nova-metadata-0\" (UID: 
\"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.739853 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.741393 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-config-data\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.755873 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22cwq\" (UniqueName: \"kubernetes.io/projected/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-kube-api-access-22cwq\") pod \"nova-metadata-0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " pod="openstack/nova-metadata-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.834733 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e885b6-4a14-4f86-9706-4c4df78b2704-logs\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.834817 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clpsr\" (UniqueName: \"kubernetes.io/projected/56e885b6-4a14-4f86-9706-4c4df78b2704-kube-api-access-clpsr\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.834846 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-config-data\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.834910 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.835401 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e885b6-4a14-4f86-9706-4c4df78b2704-logs\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.841087 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.852800 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-config-data\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " 
pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.853450 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clpsr\" (UniqueName: \"kubernetes.io/projected/56e885b6-4a14-4f86-9706-4c4df78b2704-kube-api-access-clpsr\") pod \"nova-api-0\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") " pod="openstack/nova-api-0" Oct 02 12:46:34 crc kubenswrapper[4929]: I1002 12:46:34.946852 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:46:35 crc kubenswrapper[4929]: I1002 12:46:35.005490 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:46:35 crc kubenswrapper[4929]: I1002 12:46:35.429821 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:46:35 crc kubenswrapper[4929]: W1002 12:46:35.434137 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33f6d9fc_e75a_4cb6_a760_bcef63340ba0.slice/crio-7f4ae9a158290492b9867e814282ef3774c3829d94a116b84855ad6c36b7f97d WatchSource:0}: Error finding container 7f4ae9a158290492b9867e814282ef3774c3829d94a116b84855ad6c36b7f97d: Status 404 returned error can't find the container with id 7f4ae9a158290492b9867e814282ef3774c3829d94a116b84855ad6c36b7f97d Oct 02 12:46:35 crc kubenswrapper[4929]: I1002 12:46:35.512035 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33f6d9fc-e75a-4cb6-a760-bcef63340ba0","Type":"ContainerStarted","Data":"7f4ae9a158290492b9867e814282ef3774c3829d94a116b84855ad6c36b7f97d"} Oct 02 12:46:35 crc kubenswrapper[4929]: I1002 12:46:35.544639 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.168159 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db37045c-cb4a-45f1-b530-a0da6442becd" path="/var/lib/kubelet/pods/db37045c-cb4a-45f1-b530-a0da6442becd/volumes" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.169328 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e86650-6dc6-42d4-9c1d-879f111ff9b2" path="/var/lib/kubelet/pods/e3e86650-6dc6-42d4-9c1d-879f111ff9b2/volumes" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.536134 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56e885b6-4a14-4f86-9706-4c4df78b2704","Type":"ContainerStarted","Data":"04d9f919abb6004df897eed30ca7db6741b0be4fa6e87201ab832dfa31f16088"} Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.536685 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56e885b6-4a14-4f86-9706-4c4df78b2704","Type":"ContainerStarted","Data":"4e323859df312ddcfa507ba095441a7a07d40e00202515d1ce964df782bb0cac"} Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.536701 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56e885b6-4a14-4f86-9706-4c4df78b2704","Type":"ContainerStarted","Data":"4f59cda9d8055dc2bccf6d37f1da2627f1e4dfe49decbe13c481ef036b353386"} Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.541468 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"33f6d9fc-e75a-4cb6-a760-bcef63340ba0","Type":"ContainerStarted","Data":"5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0"} Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.541537 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33f6d9fc-e75a-4cb6-a760-bcef63340ba0","Type":"ContainerStarted","Data":"8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044"} Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.543657 4929 generic.go:334] "Generic (PLEG): container finished" podID="e94be396-9a29-47d2-9f2a-ff9ab7a08605" containerID="703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa" exitCode=0 Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.543746 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e94be396-9a29-47d2-9f2a-ff9ab7a08605","Type":"ContainerDied","Data":"703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa"} Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.567620 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.567597008 podStartE2EDuration="2.567597008s" podCreationTimestamp="2025-10-02 12:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:36.556232481 +0000 UTC m=+5797.106598845" watchObservedRunningTime="2025-10-02 12:46:36.567597008 +0000 UTC m=+5797.117963372" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.601428 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6014071039999997 podStartE2EDuration="2.601407104s" podCreationTimestamp="2025-10-02 12:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:36.590332674 +0000 UTC m=+5797.140699038" watchObservedRunningTime="2025-10-02 12:46:36.601407104 +0000 UTC m=+5797.151773478" Oct 02 12:46:36 crc kubenswrapper[4929]: E1002 12:46:36.609028 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa is running failed: container process not found" containerID="703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 12:46:36 crc kubenswrapper[4929]: E1002 12:46:36.609293 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa is running failed: container process not found" containerID="703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 12:46:36 crc kubenswrapper[4929]: E1002 12:46:36.609494 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa is running failed: container process not found" containerID="703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 12:46:36 crc 
kubenswrapper[4929]: E1002 12:46:36.609522 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="e94be396-9a29-47d2-9f2a-ff9ab7a08605" containerName="nova-cell1-conductor-conductor" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.740078 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.781145 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-combined-ca-bundle\") pod \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.781217 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-config-data\") pod \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.781492 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qmv8\" (UniqueName: \"kubernetes.io/projected/e94be396-9a29-47d2-9f2a-ff9ab7a08605-kube-api-access-2qmv8\") pod \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\" (UID: \"e94be396-9a29-47d2-9f2a-ff9ab7a08605\") " Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.796201 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94be396-9a29-47d2-9f2a-ff9ab7a08605-kube-api-access-2qmv8" (OuterVolumeSpecName: "kube-api-access-2qmv8") pod "e94be396-9a29-47d2-9f2a-ff9ab7a08605" (UID: "e94be396-9a29-47d2-9f2a-ff9ab7a08605"). InnerVolumeSpecName "kube-api-access-2qmv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.807335 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e94be396-9a29-47d2-9f2a-ff9ab7a08605" (UID: "e94be396-9a29-47d2-9f2a-ff9ab7a08605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.809621 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-config-data" (OuterVolumeSpecName: "config-data") pod "e94be396-9a29-47d2-9f2a-ff9ab7a08605" (UID: "e94be396-9a29-47d2-9f2a-ff9ab7a08605"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.884222 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qmv8\" (UniqueName: \"kubernetes.io/projected/e94be396-9a29-47d2-9f2a-ff9ab7a08605-kube-api-access-2qmv8\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.884257 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.884267 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94be396-9a29-47d2-9f2a-ff9ab7a08605-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:36 crc kubenswrapper[4929]: I1002 12:46:36.973888 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.557974 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e94be396-9a29-47d2-9f2a-ff9ab7a08605","Type":"ContainerDied","Data":"c98d29ebfdccc0104c3a71ae10ebb29dcd56f296660b60d1be54e37bb34ee245"} Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.558048 4929 scope.go:117] "RemoveContainer" containerID="703c61079aff1750a2da171df8d92f7d6ddf39346404c5d06cc279132453fffa" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.558088 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.604609 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.623481 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.646618 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:46:37 crc kubenswrapper[4929]: E1002 12:46:37.647311 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94be396-9a29-47d2-9f2a-ff9ab7a08605" containerName="nova-cell1-conductor-conductor" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.647327 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94be396-9a29-47d2-9f2a-ff9ab7a08605" containerName="nova-cell1-conductor-conductor" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.647512 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94be396-9a29-47d2-9f2a-ff9ab7a08605" containerName="nova-cell1-conductor-conductor" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.648174 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.650642 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.652334 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.696766 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk7f7\" (UniqueName: \"kubernetes.io/projected/ad8f9734-246a-44d2-8538-71b18a0ccabc-kube-api-access-nk7f7\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.696867 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.697006 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.798479 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.798611 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk7f7\" (UniqueName: \"kubernetes.io/projected/ad8f9734-246a-44d2-8538-71b18a0ccabc-kube-api-access-nk7f7\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.798651 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.804094 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.804250 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.815401 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk7f7\" (UniqueName: \"kubernetes.io/projected/ad8f9734-246a-44d2-8538-71b18a0ccabc-kube-api-access-nk7f7\") pod \"nova-cell1-conductor-0\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.968472 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:37 crc kubenswrapper[4929]: I1002 12:46:37.992451 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 12:46:38 crc kubenswrapper[4929]: I1002 12:46:38.169054 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94be396-9a29-47d2-9f2a-ff9ab7a08605" path="/var/lib/kubelet/pods/e94be396-9a29-47d2-9f2a-ff9ab7a08605/volumes" Oct 02 12:46:38 crc kubenswrapper[4929]: I1002 12:46:38.413194 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:46:38 crc kubenswrapper[4929]: W1002 12:46:38.413758 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad8f9734_246a_44d2_8538_71b18a0ccabc.slice/crio-c001dfc6e45118c1b51f95fa473e3ebd546e7a0f6d38b726f7f939cf215f09f1 WatchSource:0}: Error finding container c001dfc6e45118c1b51f95fa473e3ebd546e7a0f6d38b726f7f939cf215f09f1: Status 404 returned error can't find the container with id c001dfc6e45118c1b51f95fa473e3ebd546e7a0f6d38b726f7f939cf215f09f1 Oct 02 12:46:38 crc kubenswrapper[4929]: I1002 12:46:38.573773 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ad8f9734-246a-44d2-8538-71b18a0ccabc","Type":"ContainerStarted","Data":"c001dfc6e45118c1b51f95fa473e3ebd546e7a0f6d38b726f7f939cf215f09f1"} Oct 02 12:46:39 crc kubenswrapper[4929]: I1002 12:46:39.584820 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ad8f9734-246a-44d2-8538-71b18a0ccabc","Type":"ContainerStarted","Data":"4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69"} Oct 02 12:46:39 crc kubenswrapper[4929]: I1002 12:46:39.585169 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:39 crc kubenswrapper[4929]: I1002 12:46:39.603211 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.6031939790000003 podStartE2EDuration="2.603193979s" podCreationTimestamp="2025-10-02 12:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:39.598121762 +0000 UTC m=+5800.148488136" watchObservedRunningTime="2025-10-02 12:46:39.603193979 +0000 UTC m=+5800.153560343" Oct 02 12:46:39 crc kubenswrapper[4929]: I1002 12:46:39.947150 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:46:39 crc kubenswrapper[4929]: I1002 12:46:39.947271 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:46:41 crc kubenswrapper[4929]: I1002 12:46:41.973265 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:41 crc kubenswrapper[4929]: I1002 12:46:41.982881 4929 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:42 crc kubenswrapper[4929]: I1002 12:46:42.622725 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:46:42 crc kubenswrapper[4929]: I1002 12:46:42.993590 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 12:46:42 crc kubenswrapper[4929]: I1002 12:46:42.995702 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 12:46:43 crc kubenswrapper[4929]: I1002 12:46:43.029612 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 12:46:43 crc kubenswrapper[4929]: I1002 12:46:43.647716 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 12:46:44 crc kubenswrapper[4929]: I1002 12:46:44.947783 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:46:44 crc kubenswrapper[4929]: I1002 12:46:44.948085 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:46:45 crc kubenswrapper[4929]: I1002 12:46:45.007006 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:46:45 crc kubenswrapper[4929]: I1002 12:46:45.007094 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:46:46 crc kubenswrapper[4929]: I1002 12:46:46.030393 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:46:46 crc kubenswrapper[4929]: I1002 12:46:46.030416 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:46:46 crc kubenswrapper[4929]: I1002 12:46:46.112399 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.85:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:46:46 crc kubenswrapper[4929]: I1002 12:46:46.112386 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.85:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:46:48 crc kubenswrapper[4929]: I1002 12:46:47.999602 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.531760 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.533351 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.535986 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.578044 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.708540 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.708812 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.708947 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.709056 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-scripts\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.709145 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvgp6\" (UniqueName: \"kubernetes.io/projected/cef2a42b-e101-4ae6-8161-173fc67300c6-kube-api-access-tvgp6\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.709299 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cef2a42b-e101-4ae6-8161-173fc67300c6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.811051 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.811110 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-scripts\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.811210 4929 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tvgp6\" (UniqueName: \"kubernetes.io/projected/cef2a42b-e101-4ae6-8161-173fc67300c6-kube-api-access-tvgp6\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.811276 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cef2a42b-e101-4ae6-8161-173fc67300c6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.811305 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.811327 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.811629 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cef2a42b-e101-4ae6-8161-173fc67300c6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.817547 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.818945 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.820847 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.825028 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-scripts\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 crc kubenswrapper[4929]: I1002 12:46:49.831361 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvgp6\" (UniqueName: \"kubernetes.io/projected/cef2a42b-e101-4ae6-8161-173fc67300c6-kube-api-access-tvgp6\") pod \"cinder-scheduler-0\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") " pod="openstack/cinder-scheduler-0" Oct 02 12:46:49 
crc kubenswrapper[4929]: I1002 12:46:49.879887 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 12:46:50 crc kubenswrapper[4929]: I1002 12:46:50.309615 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 12:46:50 crc kubenswrapper[4929]: W1002 12:46:50.319608 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcef2a42b_e101_4ae6_8161_173fc67300c6.slice/crio-a77b4d4e47bb24e2f2fccfcb9e92c29eca5d94ae3794ba9872627a16047968b8 WatchSource:0}: Error finding container a77b4d4e47bb24e2f2fccfcb9e92c29eca5d94ae3794ba9872627a16047968b8: Status 404 returned error can't find the container with id a77b4d4e47bb24e2f2fccfcb9e92c29eca5d94ae3794ba9872627a16047968b8 Oct 02 12:46:50 crc kubenswrapper[4929]: I1002 12:46:50.681178 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cef2a42b-e101-4ae6-8161-173fc67300c6","Type":"ContainerStarted","Data":"a77b4d4e47bb24e2f2fccfcb9e92c29eca5d94ae3794ba9872627a16047968b8"} Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.030378 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.030972 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api-log" containerID="cri-o://6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240" gracePeriod=30 Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.031041 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api" containerID="cri-o://45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c" gracePeriod=30 Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.577703 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.579932 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.582131 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.604446 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.692341 4929 generic.go:334] "Generic (PLEG): container finished" podID="d9d22557-1d53-4138-93c7-662c89983eda" containerID="6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240" exitCode=143 Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.692430 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9d22557-1d53-4138-93c7-662c89983eda","Type":"ContainerDied","Data":"6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240"} Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.694665 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cef2a42b-e101-4ae6-8161-173fc67300c6","Type":"ContainerStarted","Data":"9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5"} Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.694695 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cef2a42b-e101-4ae6-8161-173fc67300c6","Type":"ContainerStarted","Data":"443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e"} Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.717674 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.717654915 podStartE2EDuration="2.717654915s" podCreationTimestamp="2025-10-02 12:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:51.714056433 +0000 UTC m=+5812.264422807" watchObservedRunningTime="2025-10-02 12:46:51.717654915 +0000 UTC m=+5812.268021279" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748428 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748495 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748533 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-sys\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748570 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-dev\") pod 
\"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748592 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/682cda70-91b9-44fc-b6e6-364a397c6de8-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748622 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748646 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlldr\" (UniqueName: \"kubernetes.io/projected/682cda70-91b9-44fc-b6e6-364a397c6de8-kube-api-access-dlldr\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748670 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748692 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748744 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748772 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748808 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748827 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-nvme\") pod 
\"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748848 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748881 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-run\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.748934 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.850783 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.850855 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.850905 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.850922 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.850938 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-sys\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.850985 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-sys\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851101 4929 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-dev\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851144 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/682cda70-91b9-44fc-b6e6-364a397c6de8-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851195 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-dev\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851199 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851388 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlldr\" (UniqueName: \"kubernetes.io/projected/682cda70-91b9-44fc-b6e6-364a397c6de8-kube-api-access-dlldr\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851488 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851525 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851654 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851782 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.851930 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852051 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852313 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852410 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852447 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852482 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852513 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-run\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852549 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852709 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852742 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.852782 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/682cda70-91b9-44fc-b6e6-364a397c6de8-run\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.855917 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.856336 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.856632 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.857602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/682cda70-91b9-44fc-b6e6-364a397c6de8-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.868476 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/682cda70-91b9-44fc-b6e6-364a397c6de8-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.870657 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlldr\" (UniqueName: \"kubernetes.io/projected/682cda70-91b9-44fc-b6e6-364a397c6de8-kube-api-access-dlldr\") pod \"cinder-volume-volume1-0\" (UID: \"682cda70-91b9-44fc-b6e6-364a397c6de8\") " pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:51 crc kubenswrapper[4929]: I1002 12:46:51.900023 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.187590 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.208143 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.216411 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.221555 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362225 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362518 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362544 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmdh\" (UniqueName: \"kubernetes.io/projected/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-kube-api-access-xwmdh\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362577 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362593 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-dev\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362617 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-ceph\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362773 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-sys\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362831 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362920 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-lib-modules\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.362974 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.363147 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.363185 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-run\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.363314 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-scripts\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.363379 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-config-data\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.363442 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.363472 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.449144 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.451798 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.465768 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc 
kubenswrapper[4929]: I1002 12:46:52.465828 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-run\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.465892 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-scripts\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.465932 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-config-data\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.465976 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466003 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466047 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466068 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466094 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmdh\" (UniqueName: \"kubernetes.io/projected/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-kube-api-access-xwmdh\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466129 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466151 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-dev\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 
12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466199 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-ceph\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466231 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-sys\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466250 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466279 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-lib-modules\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466309 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.466409 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.467717 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.467763 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-run\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.467793 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.467812 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: 
I1002 12:46:52.467842 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-sys\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.467847 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.467874 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-lib-modules\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.470118 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.470411 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-dev\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.472011 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-ceph\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.472540 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.473004 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-config-data\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.474175 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-scripts\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.474639 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.487312 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xwmdh\" (UniqueName: \"kubernetes.io/projected/e3514ec2-d436-4bdb-8d89-ffb37157ac2d-kube-api-access-xwmdh\") pod \"cinder-backup-0\" (UID: \"e3514ec2-d436-4bdb-8d89-ffb37157ac2d\") " pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.531024 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 02 12:46:52 crc kubenswrapper[4929]: I1002 12:46:52.707582 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"682cda70-91b9-44fc-b6e6-364a397c6de8","Type":"ContainerStarted","Data":"193f951e2199688313194c5db6d36be676b1a1885c938557ca3aff56c43ff9b8"} Oct 02 12:46:53 crc kubenswrapper[4929]: I1002 12:46:53.050881 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 02 12:46:53 crc kubenswrapper[4929]: I1002 12:46:53.717863 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e3514ec2-d436-4bdb-8d89-ffb37157ac2d","Type":"ContainerStarted","Data":"96e447c6d4866501867d2cd01e7a3e5855dec4ea67e5df40f66094d3e416d7a7"} Oct 02 12:46:53 crc kubenswrapper[4929]: I1002 12:46:53.722247 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"682cda70-91b9-44fc-b6e6-364a397c6de8","Type":"ContainerStarted","Data":"e23aef803ea18b83deb906c59bb9fc3584365b5dc8d793666f60ba0d56b21241"} Oct 02 12:46:53 crc kubenswrapper[4929]: I1002 12:46:53.722331 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"682cda70-91b9-44fc-b6e6-364a397c6de8","Type":"ContainerStarted","Data":"04e12f0d956ce86dff9e0d6a37f1d3eacd2aaf76f0e34d6fd917999a93b28544"} Oct 02 12:46:53 crc kubenswrapper[4929]: I1002 12:46:53.753082 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=1.911564665 podStartE2EDuration="2.753065569s" podCreationTimestamp="2025-10-02 12:46:51 +0000 UTC" firstStartedPulling="2025-10-02 12:46:52.448830023 +0000 UTC m=+5812.999196387" lastFinishedPulling="2025-10-02 12:46:53.290330927 +0000 UTC m=+5813.840697291" observedRunningTime="2025-10-02 12:46:53.745359931 +0000 UTC m=+5814.295726305" watchObservedRunningTime="2025-10-02 12:46:53.753065569 +0000 UTC m=+5814.303431933" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.180259 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.80:8776/healthcheck\": read tcp 10.217.0.2:33180->10.217.1.80:8776: read: connection reset by peer" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.529894 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.713576 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d22557-1d53-4138-93c7-662c89983eda-logs\") pod \"d9d22557-1d53-4138-93c7-662c89983eda\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.713810 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9d22557-1d53-4138-93c7-662c89983eda-etc-machine-id\") pod \"d9d22557-1d53-4138-93c7-662c89983eda\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.713847 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-combined-ca-bundle\") pod \"d9d22557-1d53-4138-93c7-662c89983eda\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.713868 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data\") pod \"d9d22557-1d53-4138-93c7-662c89983eda\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.713892 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data-custom\") pod \"d9d22557-1d53-4138-93c7-662c89983eda\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.713913 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q998\" (UniqueName: \"kubernetes.io/projected/d9d22557-1d53-4138-93c7-662c89983eda-kube-api-access-5q998\") pod \"d9d22557-1d53-4138-93c7-662c89983eda\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.713933 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-scripts\") pod \"d9d22557-1d53-4138-93c7-662c89983eda\" (UID: \"d9d22557-1d53-4138-93c7-662c89983eda\") " Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.714157 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d22557-1d53-4138-93c7-662c89983eda-logs" (OuterVolumeSpecName: "logs") pod "d9d22557-1d53-4138-93c7-662c89983eda" (UID: "d9d22557-1d53-4138-93c7-662c89983eda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.714252 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9d22557-1d53-4138-93c7-662c89983eda-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d9d22557-1d53-4138-93c7-662c89983eda" (UID: "d9d22557-1d53-4138-93c7-662c89983eda"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.714649 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d22557-1d53-4138-93c7-662c89983eda-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.714672 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9d22557-1d53-4138-93c7-662c89983eda-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.748428 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d22557-1d53-4138-93c7-662c89983eda-kube-api-access-5q998" (OuterVolumeSpecName: "kube-api-access-5q998") pod "d9d22557-1d53-4138-93c7-662c89983eda" (UID: "d9d22557-1d53-4138-93c7-662c89983eda"). InnerVolumeSpecName "kube-api-access-5q998". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.748538 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-scripts" (OuterVolumeSpecName: "scripts") pod "d9d22557-1d53-4138-93c7-662c89983eda" (UID: "d9d22557-1d53-4138-93c7-662c89983eda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.749826 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d9d22557-1d53-4138-93c7-662c89983eda" (UID: "d9d22557-1d53-4138-93c7-662c89983eda"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.817181 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.817472 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q998\" (UniqueName: \"kubernetes.io/projected/d9d22557-1d53-4138-93c7-662c89983eda-kube-api-access-5q998\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.817483 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.826950 4929 generic.go:334] "Generic (PLEG): container finished" podID="d9d22557-1d53-4138-93c7-662c89983eda" containerID="45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c" exitCode=0 Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.827073 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9d22557-1d53-4138-93c7-662c89983eda","Type":"ContainerDied","Data":"45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c"} Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.827109 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9d22557-1d53-4138-93c7-662c89983eda","Type":"ContainerDied","Data":"489425f018f2cc6aa38fd5ed07454b25b1d947134f9e7f0a42d84b9e620da5fc"} Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.827131 4929 scope.go:117] "RemoveContainer" containerID="45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.827163 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.840143 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9d22557-1d53-4138-93c7-662c89983eda" (UID: "d9d22557-1d53-4138-93c7-662c89983eda"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.856030 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e3514ec2-d436-4bdb-8d89-ffb37157ac2d","Type":"ContainerStarted","Data":"ecce17f71a2bce9578f2b7987bc6ea44bcf1c87e073c286b636b92bf1235e87c"} Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.856075 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"e3514ec2-d436-4bdb-8d89-ffb37157ac2d","Type":"ContainerStarted","Data":"411b9eb208d382c7d4a93267c693731de761b896d79e1c0759116d4df1584604"} Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.880927 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.901489 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data" (OuterVolumeSpecName: "config-data") pod "d9d22557-1d53-4138-93c7-662c89983eda" (UID: "d9d22557-1d53-4138-93c7-662c89983eda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.910723 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=1.988658885 podStartE2EDuration="2.910701721s" podCreationTimestamp="2025-10-02 12:46:52 +0000 UTC" firstStartedPulling="2025-10-02 12:46:53.060582508 +0000 UTC m=+5813.610948872" lastFinishedPulling="2025-10-02 12:46:53.982625344 +0000 UTC m=+5814.532991708" observedRunningTime="2025-10-02 12:46:54.899385491 +0000 UTC m=+5815.449751845" watchObservedRunningTime="2025-10-02 12:46:54.910701721 +0000 UTC m=+5815.461068085" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.919682 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.919895 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d22557-1d53-4138-93c7-662c89983eda-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.953140 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.955808 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.959393 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 12:46:54 crc kubenswrapper[4929]: I1002 12:46:54.996691 4929 scope.go:117] "RemoveContainer" containerID="6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.010139 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.011069 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.011166 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.013206 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.018149 4929 scope.go:117] "RemoveContainer" containerID="45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c" Oct 02 12:46:55 crc kubenswrapper[4929]: E1002 12:46:55.018758 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c\": container with ID starting with 45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c not found: ID does not exist" containerID="45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.018805 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c"} err="failed to get container status \"45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c\": rpc error: code = NotFound desc = could not find container \"45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c\": container with ID starting with 45348406489a56157ea7ccb41297277e4f24a6f7740ebb85fc1484ffc78ee49c not found: ID does not exist" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.018844 4929 scope.go:117] "RemoveContainer" containerID="6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240" Oct 02 12:46:55 crc kubenswrapper[4929]: E1002 12:46:55.019350 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240\": container with ID starting with 6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240 not found: ID does not exist" containerID="6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.019388 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240"} err="failed to get container status \"6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240\": rpc error: code = NotFound desc = could not find container \"6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240\": container with ID starting with 6f630c8a731f46ce04cbb22914bf1ce0b3e950df390db06f6fbf3bb675396240 not found: ID does not exist" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.211775 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.226064 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.236928 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:46:55 crc kubenswrapper[4929]: E1002 12:46:55.237502 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api" Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.237574 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api" Oct 02 12:46:55 crc kubenswrapper[4929]: E1002 
Oct 02 12:46:55 crc kubenswrapper[4929]: E1002 12:46:55.237664 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api-log"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.237734 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api-log"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.238405 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api-log"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.238449 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d22557-1d53-4138-93c7-662c89983eda" containerName="cinder-api"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.239846 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.243238 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.245580 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.327715 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.327888 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85ff883b-f424-4808-bc97-530931bc3025-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.328046 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-scripts\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.328085 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs69w\" (UniqueName: \"kubernetes.io/projected/85ff883b-f424-4808-bc97-530931bc3025-kube-api-access-vs69w\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.328121 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-config-data-custom\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.328238 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-config-data\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.328384 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ff883b-f424-4808-bc97-530931bc3025-logs\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.430482 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs69w\" (UniqueName: \"kubernetes.io/projected/85ff883b-f424-4808-bc97-530931bc3025-kube-api-access-vs69w\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.430533 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-config-data-custom\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.430578 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-config-data\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.430641 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ff883b-f424-4808-bc97-530931bc3025-logs\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.430674 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.430699 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85ff883b-f424-4808-bc97-530931bc3025-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.430744 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-scripts\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.431904 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85ff883b-f424-4808-bc97-530931bc3025-logs\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.432598 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85ff883b-f424-4808-bc97-530931bc3025-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.436138 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-scripts\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.436684 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.442073 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-config-data-custom\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.442582 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ff883b-f424-4808-bc97-530931bc3025-config-data\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.449924 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs69w\" (UniqueName: \"kubernetes.io/projected/85ff883b-f424-4808-bc97-530931bc3025-kube-api-access-vs69w\") pod \"cinder-api-0\" (UID: \"85ff883b-f424-4808-bc97-530931bc3025\") " pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.564033 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.878991 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.881303 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 12:46:55 crc kubenswrapper[4929]: I1002 12:46:55.883772 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 12:46:56 crc kubenswrapper[4929]: I1002 12:46:56.025529 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 12:46:56 crc kubenswrapper[4929]: I1002 12:46:56.171838 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d22557-1d53-4138-93c7-662c89983eda" path="/var/lib/kubelet/pods/d9d22557-1d53-4138-93c7-662c89983eda/volumes"
Oct 02 12:46:56 crc kubenswrapper[4929]: I1002 12:46:56.900193 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0"
Oct 02 12:46:56 crc kubenswrapper[4929]: I1002 12:46:56.921821 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85ff883b-f424-4808-bc97-530931bc3025","Type":"ContainerStarted","Data":"1be9869df597a6cc2b3f3582837db79bc25d7d8f4201f45fc1e1881398b5206e"}
Oct 02 12:46:56 crc kubenswrapper[4929]: I1002 12:46:56.921880 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85ff883b-f424-4808-bc97-530931bc3025","Type":"ContainerStarted","Data":"60396f3e9a178cf2d439d6d05af695085372bfbc6b963bdafd038cbc672980e9"}
Oct 02 12:46:57 crc kubenswrapper[4929]: I1002 12:46:57.532123 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Oct 02 12:46:57 crc kubenswrapper[4929]: I1002 12:46:57.954082 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85ff883b-f424-4808-bc97-530931bc3025","Type":"ContainerStarted","Data":"d34c6c5d8fece78e48e1c1c2e7ac5ce41f5df6a1b022af1319cb01dc8f7134b2"}
Oct 02 12:46:57 crc kubenswrapper[4929]: I1002 12:46:57.954417 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 02 12:46:57 crc kubenswrapper[4929]: I1002 12:46:57.971154 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.971133879 podStartE2EDuration="2.971133879s" podCreationTimestamp="2025-10-02 12:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:46:57.968818093 +0000 UTC m=+5818.519184457" watchObservedRunningTime="2025-10-02 12:46:57.971133879 +0000 UTC m=+5818.521500243"
Oct 02 12:47:00 crc kubenswrapper[4929]: I1002 12:47:00.090674 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 02 12:47:00 crc kubenswrapper[4929]: I1002 12:47:00.142518 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:47:00 crc kubenswrapper[4929]: I1002 12:47:00.976178 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="cinder-scheduler" containerID="cri-o://443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e" gracePeriod=30
Oct 02 12:47:00 crc kubenswrapper[4929]: I1002 12:47:00.976218 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="probe" containerID="cri-o://9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5" gracePeriod=30
Oct 02 12:47:01 crc kubenswrapper[4929]: I1002 12:47:01.987118 4929 generic.go:334] "Generic (PLEG): container finished" podID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerID="9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5" exitCode=0
Oct 02 12:47:01 crc kubenswrapper[4929]: I1002 12:47:01.987170 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cef2a42b-e101-4ae6-8161-173fc67300c6","Type":"ContainerDied","Data":"9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5"}
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.111870 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
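The pod_startup_latency_tracker entry above is plain timestamp arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and the zero-valued pulling timestamps just mean no image pull was needed. Reproducing the cinder-api-0 figure in Go, using the values copied from the log:

    // Recompute podStartE2EDuration from the two timestamps in the entry above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Layout matches how Go's time.Time prints, which is what the log records.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2025-10-02 12:46:55 +0000 UTC")
        observed, _ := time.Parse(layout, "2025-10-02 12:46:57.971133879 +0000 UTC")
        fmt.Println(observed.Sub(created)) // 2.971133879s, matching podStartE2EDuration
    }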
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.459871 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.574099 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data\") pod \"cef2a42b-e101-4ae6-8161-173fc67300c6\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") "
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.574179 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cef2a42b-e101-4ae6-8161-173fc67300c6-etc-machine-id\") pod \"cef2a42b-e101-4ae6-8161-173fc67300c6\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") "
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.574213 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data-custom\") pod \"cef2a42b-e101-4ae6-8161-173fc67300c6\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") "
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.574257 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-combined-ca-bundle\") pod \"cef2a42b-e101-4ae6-8161-173fc67300c6\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") "
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.574286 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cef2a42b-e101-4ae6-8161-173fc67300c6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cef2a42b-e101-4ae6-8161-173fc67300c6" (UID: "cef2a42b-e101-4ae6-8161-173fc67300c6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.574323 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-scripts\") pod \"cef2a42b-e101-4ae6-8161-173fc67300c6\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") "
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.574426 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvgp6\" (UniqueName: \"kubernetes.io/projected/cef2a42b-e101-4ae6-8161-173fc67300c6-kube-api-access-tvgp6\") pod \"cef2a42b-e101-4ae6-8161-173fc67300c6\" (UID: \"cef2a42b-e101-4ae6-8161-173fc67300c6\") "
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.575587 4929 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cef2a42b-e101-4ae6-8161-173fc67300c6-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.579519 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-scripts" (OuterVolumeSpecName: "scripts") pod "cef2a42b-e101-4ae6-8161-173fc67300c6" (UID: "cef2a42b-e101-4ae6-8161-173fc67300c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.580380 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cef2a42b-e101-4ae6-8161-173fc67300c6" (UID: "cef2a42b-e101-4ae6-8161-173fc67300c6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.580522 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef2a42b-e101-4ae6-8161-173fc67300c6-kube-api-access-tvgp6" (OuterVolumeSpecName: "kube-api-access-tvgp6") pod "cef2a42b-e101-4ae6-8161-173fc67300c6" (UID: "cef2a42b-e101-4ae6-8161-173fc67300c6"). InnerVolumeSpecName "kube-api-access-tvgp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.644291 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cef2a42b-e101-4ae6-8161-173fc67300c6" (UID: "cef2a42b-e101-4ae6-8161-173fc67300c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.667895 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data" (OuterVolumeSpecName: "config-data") pod "cef2a42b-e101-4ae6-8161-173fc67300c6" (UID: "cef2a42b-e101-4ae6-8161-173fc67300c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.677118 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvgp6\" (UniqueName: \"kubernetes.io/projected/cef2a42b-e101-4ae6-8161-173fc67300c6-kube-api-access-tvgp6\") on node \"crc\" DevicePath \"\""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.677149 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.677158 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.677167 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.677177 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef2a42b-e101-4ae6-8161-173fc67300c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:47:02 crc kubenswrapper[4929]: I1002 12:47:02.746942 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:02.999179 4929 generic.go:334] "Generic (PLEG): container finished" podID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerID="443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e" exitCode=0
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:02.999226 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cef2a42b-e101-4ae6-8161-173fc67300c6","Type":"ContainerDied","Data":"443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e"}
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:02.999257 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cef2a42b-e101-4ae6-8161-173fc67300c6","Type":"ContainerDied","Data":"a77b4d4e47bb24e2f2fccfcb9e92c29eca5d94ae3794ba9872627a16047968b8"}
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:02.999279 4929 scope.go:117] "RemoveContainer" containerID="9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5"
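Between the "Killing container with a grace period" entries (gracePeriod=30) and the ContainerDied events with exitCode=0 above sits the standard termination sequence: SIGTERM first, wait up to the grace period, SIGKILL only on timeout. A generic Go illustration of that shape, not kubelet or CRI-O code:

    // SIGTERM, then SIGKILL after the grace period, against a local process.
    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
        cmd.Process.Signal(syscall.SIGTERM) // the polite request logged as "Killing container with a grace period"
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        select {
        case err := <-done:
            fmt.Println("exited within grace period:", err) // clean exit, like exitCode=0 above
        case <-time.After(grace):
            cmd.Process.Kill() // SIGKILL once the grace period expires
            <-done
            fmt.Println("killed after grace period expired")
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        killWithGrace(cmd, 30*time.Second)
    }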
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:02.999436 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.046858 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.050487 4929 scope.go:117] "RemoveContainer" containerID="443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.068192 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.074575 4929 scope.go:117] "RemoveContainer" containerID="9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5"
Oct 02 12:47:03 crc kubenswrapper[4929]: E1002 12:47:03.075568 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5\": container with ID starting with 9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5 not found: ID does not exist" containerID="9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.075623 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5"} err="failed to get container status \"9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5\": rpc error: code = NotFound desc = could not find container \"9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5\": container with ID starting with 9f07a87656522f4f3ecb8048323914a11db4a07119d72e390bde03caa14012c5 not found: ID does not exist"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.075645 4929 scope.go:117] "RemoveContainer" containerID="443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e"
Oct 02 12:47:03 crc kubenswrapper[4929]: E1002 12:47:03.075936 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e\": container with ID starting with 443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e not found: ID does not exist" containerID="443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.075974 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e"} err="failed to get container status \"443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e\": rpc error: code = NotFound desc = could not find container \"443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e\": container with ID starting with 443aa5238ba993713b7368bcedebc7a68590c7fa1c2645f76cdb4186a678c89e not found: ID does not exist"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.077417 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:47:03 crc kubenswrapper[4929]: E1002 12:47:03.077816 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="probe"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.077834 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="probe"
Oct 02 12:47:03 crc kubenswrapper[4929]: E1002 12:47:03.077859 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="cinder-scheduler"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.077866 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="cinder-scheduler"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.078081 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="probe"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.078105 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" containerName="cinder-scheduler"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.079108 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.082672 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.092053 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.187478 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.187544 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/692a38b4-b060-4289-90b5-224e45ed83ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.187591 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.187618 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76btz\" (UniqueName: \"kubernetes.io/projected/692a38b4-b060-4289-90b5-224e45ed83ca-kube-api-access-76btz\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.187681 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.187700 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.289625 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.289673 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.289778 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.289813 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/692a38b4-b060-4289-90b5-224e45ed83ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.289836 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.289857 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76btz\" (UniqueName: \"kubernetes.io/projected/692a38b4-b060-4289-90b5-224e45ed83ca-kube-api-access-76btz\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.290131 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/692a38b4-b060-4289-90b5-224e45ed83ca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.295124 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.295380 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-scripts\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.295790 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.296298 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/692a38b4-b060-4289-90b5-224e45ed83ca-config-data\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.304498 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76btz\" (UniqueName: \"kubernetes.io/projected/692a38b4-b060-4289-90b5-224e45ed83ca-kube-api-access-76btz\") pod \"cinder-scheduler-0\" (UID: \"692a38b4-b060-4289-90b5-224e45ed83ca\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.402228 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:47:03 crc kubenswrapper[4929]: I1002 12:47:03.849937 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:47:04 crc kubenswrapper[4929]: I1002 12:47:04.012250 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"692a38b4-b060-4289-90b5-224e45ed83ca","Type":"ContainerStarted","Data":"d574435d0926ffe74281d11defe667a4da8aff51d2f5104d95017d40754dd6dd"}
Oct 02 12:47:04 crc kubenswrapper[4929]: I1002 12:47:04.170135 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef2a42b-e101-4ae6-8161-173fc67300c6" path="/var/lib/kubelet/pods/cef2a42b-e101-4ae6-8161-173fc67300c6/volumes"
Oct 02 12:47:05 crc kubenswrapper[4929]: I1002 12:47:05.022316 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"692a38b4-b060-4289-90b5-224e45ed83ca","Type":"ContainerStarted","Data":"9922dcd7c6a77072b342e6bb976b52d7c28387e29c3db499b880aafc3a75e008"}
Oct 02 12:47:05 crc kubenswrapper[4929]: I1002 12:47:05.022697 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"692a38b4-b060-4289-90b5-224e45ed83ca","Type":"ContainerStarted","Data":"ba6633ff54dd83e0b8d8e5a095c50785ab1c59104ec56dd0939dae8ad87deeb8"}
Oct 02 12:47:05 crc kubenswrapper[4929]: I1002 12:47:05.040116 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.040097058 podStartE2EDuration="2.040097058s" podCreationTimestamp="2025-10-02 12:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:47:05.03770928 +0000 UTC m=+5825.588075654" watchObservedRunningTime="2025-10-02 12:47:05.040097058 +0000 UTC m=+5825.590463422"
Oct 02 12:47:07 crc kubenswrapper[4929]: I1002 12:47:07.416861 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 02 12:47:08 crc kubenswrapper[4929]: I1002 12:47:08.403382 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 02 12:47:13 crc kubenswrapper[4929]: I1002 12:47:13.593238 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 02 12:47:23 crc kubenswrapper[4929]: I1002 12:47:23.082896 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-n7cpg"]
Oct 02 12:47:23 crc kubenswrapper[4929]: I1002 12:47:23.091740 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-n7cpg"]
Oct 02 12:47:24 crc kubenswrapper[4929]: I1002 12:47:24.169500 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a1da2e6-cf33-4709-a04f-6749a089638b" path="/var/lib/kubelet/pods/0a1da2e6-cf33-4709-a04f-6749a089638b/volumes"
Oct 02 12:47:34 crc kubenswrapper[4929]: I1002 12:47:34.049591 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bf22-account-create-thfq5"]
Oct 02 12:47:34 crc kubenswrapper[4929]: I1002 12:47:34.061420 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bf22-account-create-thfq5"]
Oct 02 12:47:34 crc kubenswrapper[4929]: I1002 12:47:34.167726 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877dbf06-5427-494f-bc8a-529830e2180e" path="/var/lib/kubelet/pods/877dbf06-5427-494f-bc8a-529830e2180e/volumes"
Oct 02 12:47:40 crc kubenswrapper[4929]: I1002 12:47:40.033500 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-h69g2"]
Oct 02 12:47:40 crc kubenswrapper[4929]: I1002 12:47:40.043117 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-h69g2"]
Oct 02 12:47:40 crc kubenswrapper[4929]: I1002 12:47:40.169154 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8670e7-a468-4168-97fd-5bf24b827746" path="/var/lib/kubelet/pods/8c8670e7-a468-4168-97fd-5bf24b827746/volumes"
Oct 02 12:47:54 crc kubenswrapper[4929]: I1002 12:47:54.030695 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-f2v2l"]
Oct 02 12:47:54 crc kubenswrapper[4929]: I1002 12:47:54.040042 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-f2v2l"]
Oct 02 12:47:54 crc kubenswrapper[4929]: I1002 12:47:54.166677 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51638f83-9423-4982-a164-4483048e76a8" path="/var/lib/kubelet/pods/51638f83-9423-4982-a164-4483048e76a8/volumes"
Oct 02 12:48:16 crc kubenswrapper[4929]: I1002 12:48:16.727704 4929 scope.go:117] "RemoveContainer" containerID="4660f85746d9dc833cfb15cd11f1678298ae51af202e38e485aa91eeef8a835a"
Oct 02 12:48:16 crc kubenswrapper[4929]: I1002 12:48:16.753533 4929 scope.go:117] "RemoveContainer" containerID="5ab0eaa73d287fd2f53824262aafaf7dd0296e0b449b7fc021a08fe107cce0ce"
Oct 02 12:48:16 crc kubenswrapper[4929]: I1002 12:48:16.793483 4929 scope.go:117] "RemoveContainer" containerID="b9503834e331999ded2167675da7403e8f1625e1044dbdbf60123c822b59455d"
Oct 02 12:48:16 crc kubenswrapper[4929]: I1002 12:48:16.844844 4929 scope.go:117] "RemoveContainer" containerID="2e7881f32499e05fb1678074b477846d647205c96e4702c4d86791d0e2a847ff"
Oct 02 12:48:16 crc kubenswrapper[4929]: I1002 12:48:16.881350 4929 scope.go:117] "RemoveContainer" containerID="68d6dc446318d6f8a2390fe134f37903397a72dad02ebe947e0a8176f2141604"
Oct 02 12:48:16 crc kubenswrapper[4929]: I1002 12:48:16.915520 4929 scope.go:117] "RemoveContainer" containerID="8f7d43c71d26b46dbe717daba829cf49f424f8e3d3843e4840dcc24fea3e2375"
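Each "Cleaned up orphaned pod volumes dir" entry above records the kubelet removing /var/lib/kubelet/pods/<podUID>/volumes once the API object is gone and no volume contents remain. A simplified Go sketch of that housekeeping; the real kubelet performs far more checks (mount points, subpaths, in-use volumes) before deleting anything:

    // Remove a deleted pod's volumes dir only if it is already empty.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func cleanupOrphanedPodDir(kubeletRoot, podUID string) error {
        volumes := filepath.Join(kubeletRoot, "pods", podUID, "volumes")
        entries, err := os.ReadDir(volumes)
        if err != nil {
            return err // already gone, or not readable
        }
        if len(entries) > 0 {
            return fmt.Errorf("orphaned pod %q found, but volumes are still present", podUID)
        }
        fmt.Printf("Cleaned up orphaned pod volumes dir path=%q\n", volumes)
        return os.Remove(volumes)
    }

    func main() {
        _ = cleanupOrphanedPodDir("/var/lib/kubelet", "0a1da2e6-cf33-4709-a04f-6749a089638b")
    }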
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:48:44 crc kubenswrapper[4929]: I1002 12:48:44.738631 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.598239 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sc5zm"] Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.600151 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sc5zm" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.602325 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-9hqv8" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.603986 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djncn\" (UniqueName: \"kubernetes.io/projected/1ee7c592-3942-47da-9be7-e146a4768544-kube-api-access-djncn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.604028 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee7c592-3942-47da-9be7-e146a4768544-scripts\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.604102 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-log-ovn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.604134 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-run\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.604159 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-run-ovn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.605984 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.613486 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hpmws"] Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.615918 4929 util.go:30] "No sandbox for pod can be found. 
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.615918 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.625561 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sc5zm"]
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.649304 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hpmws"]
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.705818 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-lib\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.705857 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e15929c9-ed67-44ce-8158-b85635f32121-scripts\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.705921 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksd49\" (UniqueName: \"kubernetes.io/projected/e15929c9-ed67-44ce-8158-b85635f32121-kube-api-access-ksd49\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.705982 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-run\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706008 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-etc-ovs\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706097 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djncn\" (UniqueName: \"kubernetes.io/projected/1ee7c592-3942-47da-9be7-e146a4768544-kube-api-access-djncn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706121 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee7c592-3942-47da-9be7-e146a4768544-scripts\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706170 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-log-ovn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706191 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-run\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706217 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-log\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706233 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-run-ovn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706694 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-run-ovn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706723 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-log-ovn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.706760 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ee7c592-3942-47da-9be7-e146a4768544-var-run\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.709007 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ee7c592-3942-47da-9be7-e146a4768544-scripts\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.737392 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djncn\" (UniqueName: \"kubernetes.io/projected/1ee7c592-3942-47da-9be7-e146a4768544-kube-api-access-djncn\") pod \"ovn-controller-sc5zm\" (UID: \"1ee7c592-3942-47da-9be7-e146a4768544\") " pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.809701 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-log\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.810070 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-lib\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.810097 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e15929c9-ed67-44ce-8158-b85635f32121-scripts\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.809889 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-log\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.810400 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksd49\" (UniqueName: \"kubernetes.io/projected/e15929c9-ed67-44ce-8158-b85635f32121-kube-api-access-ksd49\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.810432 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-run\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.810490 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-etc-ovs\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.810700 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-etc-ovs\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.811112 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-run\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.812553 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e15929c9-ed67-44ce-8158-b85635f32121-scripts\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.812668 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e15929c9-ed67-44ce-8158-b85635f32121-var-lib\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.829106 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksd49\" (UniqueName: \"kubernetes.io/projected/e15929c9-ed67-44ce-8158-b85635f32121-kube-api-access-ksd49\") pod \"ovn-controller-ovs-hpmws\" (UID: \"e15929c9-ed67-44ce-8158-b85635f32121\") " pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.927027 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sc5zm"
Oct 02 12:48:54 crc kubenswrapper[4929]: I1002 12:48:54.940882 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hpmws"
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.529735 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sc5zm"]
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.819442 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hpmws"]
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.916874 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-w7jz9"]
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.918332 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-w7jz9"
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.937845 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.944396 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-ovs-rundir\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9"
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.944596 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-config\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9"
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.944818 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjkc8\" (UniqueName: \"kubernetes.io/projected/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-kube-api-access-qjkc8\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9"
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.944919 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-ovn-rundir\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9"
Oct 02 12:48:55 crc kubenswrapper[4929]: I1002 12:48:55.976828 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w7jz9"]
Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.019236 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hpmws" event={"ID":"e15929c9-ed67-44ce-8158-b85635f32121","Type":"ContainerStarted","Data":"ce03a955d6093d9de701ff6c32e846d2286709b090dba41c34e64aa38ad0ecad"}
pod="openstack/ovn-controller-sc5zm" event={"ID":"1ee7c592-3942-47da-9be7-e146a4768544","Type":"ContainerStarted","Data":"aae7514c559863dcb48549208dd767663ad984596063cf5182b28e6868746bf8"} Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.048052 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjkc8\" (UniqueName: \"kubernetes.io/projected/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-kube-api-access-qjkc8\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.048110 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-ovn-rundir\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.048200 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-ovs-rundir\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.048229 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-config\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.048575 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-ovs-rundir\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.048574 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-ovn-rundir\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.048847 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-config\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.081174 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjkc8\" (UniqueName: \"kubernetes.io/projected/cca7d8a2-15be-44ac-9bb5-4aad5973a40d-kube-api-access-qjkc8\") pod \"ovn-controller-metrics-w7jz9\" (UID: \"cca7d8a2-15be-44ac-9bb5-4aad5973a40d\") " pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.314394 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-w7jz9" Oct 02 12:48:56 crc kubenswrapper[4929]: I1002 12:48:56.932603 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-w7jz9"] Oct 02 12:48:56 crc kubenswrapper[4929]: W1002 12:48:56.939548 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcca7d8a2_15be_44ac_9bb5_4aad5973a40d.slice/crio-ee383cbd116400bde61469e568b507326388dd625b072e745fb2fe3d7518e66c WatchSource:0}: Error finding container ee383cbd116400bde61469e568b507326388dd625b072e745fb2fe3d7518e66c: Status 404 returned error can't find the container with id ee383cbd116400bde61469e568b507326388dd625b072e745fb2fe3d7518e66c Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.029936 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hpmws" event={"ID":"e15929c9-ed67-44ce-8158-b85635f32121","Type":"ContainerStarted","Data":"ee798d39bec5ec383c0a35d54c9d8abd4ed35dcf4e0b661063303105f9342c1c"} Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.033029 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm" event={"ID":"1ee7c592-3942-47da-9be7-e146a4768544","Type":"ContainerStarted","Data":"9f54b10ac5da4f02821d0adef0a05b4f887a6433da60cb3aac02911d0c334cce"} Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.033105 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-sc5zm" Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.035090 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w7jz9" event={"ID":"cca7d8a2-15be-44ac-9bb5-4aad5973a40d","Type":"ContainerStarted","Data":"ee383cbd116400bde61469e568b507326388dd625b072e745fb2fe3d7518e66c"} Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.076498 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sc5zm" podStartSLOduration=3.076477829 podStartE2EDuration="3.076477829s" podCreationTimestamp="2025-10-02 12:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:48:57.067672319 +0000 UTC m=+5937.618038683" watchObservedRunningTime="2025-10-02 12:48:57.076477829 +0000 UTC m=+5937.626844193" Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.625367 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-4kqts"] Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.627136 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-4kqts" Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.657744 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-4kqts"] Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.788711 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k67lp\" (UniqueName: \"kubernetes.io/projected/8a9c92bb-12e4-4a69-ba64-217902786770-kube-api-access-k67lp\") pod \"octavia-db-create-4kqts\" (UID: \"8a9c92bb-12e4-4a69-ba64-217902786770\") " pod="openstack/octavia-db-create-4kqts" Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.890146 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k67lp\" (UniqueName: \"kubernetes.io/projected/8a9c92bb-12e4-4a69-ba64-217902786770-kube-api-access-k67lp\") pod \"octavia-db-create-4kqts\" (UID: \"8a9c92bb-12e4-4a69-ba64-217902786770\") " pod="openstack/octavia-db-create-4kqts" Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.908095 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k67lp\" (UniqueName: \"kubernetes.io/projected/8a9c92bb-12e4-4a69-ba64-217902786770-kube-api-access-k67lp\") pod \"octavia-db-create-4kqts\" (UID: \"8a9c92bb-12e4-4a69-ba64-217902786770\") " pod="openstack/octavia-db-create-4kqts" Oct 02 12:48:57 crc kubenswrapper[4929]: I1002 12:48:57.950533 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4kqts" Oct 02 12:48:58 crc kubenswrapper[4929]: I1002 12:48:58.056661 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-w7jz9" event={"ID":"cca7d8a2-15be-44ac-9bb5-4aad5973a40d","Type":"ContainerStarted","Data":"8e906b2dc1801a4f12dcddac5c3d78c4d1ea04ef79be52d36d06691d9a4fe9ed"} Oct 02 12:48:58 crc kubenswrapper[4929]: I1002 12:48:58.065109 4929 generic.go:334] "Generic (PLEG): container finished" podID="e15929c9-ed67-44ce-8158-b85635f32121" containerID="ee798d39bec5ec383c0a35d54c9d8abd4ed35dcf4e0b661063303105f9342c1c" exitCode=0 Oct 02 12:48:58 crc kubenswrapper[4929]: I1002 12:48:58.065668 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hpmws" event={"ID":"e15929c9-ed67-44ce-8158-b85635f32121","Type":"ContainerDied","Data":"ee798d39bec5ec383c0a35d54c9d8abd4ed35dcf4e0b661063303105f9342c1c"} Oct 02 12:48:58 crc kubenswrapper[4929]: I1002 12:48:58.088123 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-w7jz9" podStartSLOduration=3.088100514 podStartE2EDuration="3.088100514s" podCreationTimestamp="2025-10-02 12:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:48:58.080018485 +0000 UTC m=+5938.630384849" watchObservedRunningTime="2025-10-02 12:48:58.088100514 +0000 UTC m=+5938.638466878" Oct 02 12:48:58 crc kubenswrapper[4929]: I1002 12:48:58.455803 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-4kqts"] Oct 02 12:48:58 crc kubenswrapper[4929]: W1002 12:48:58.456556 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a9c92bb_12e4_4a69_ba64_217902786770.slice/crio-33ea590b909f0a56da2649ae6f75536d443ef5d87d37f5ee134ea3a782214899 WatchSource:0}: Error finding container 
33ea590b909f0a56da2649ae6f75536d443ef5d87d37f5ee134ea3a782214899: Status 404 returned error can't find the container with id 33ea590b909f0a56da2649ae6f75536d443ef5d87d37f5ee134ea3a782214899 Oct 02 12:48:59 crc kubenswrapper[4929]: I1002 12:48:59.075187 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4kqts" event={"ID":"8a9c92bb-12e4-4a69-ba64-217902786770","Type":"ContainerStarted","Data":"695d8e4de01a2142e05074a7c059e6c674c40b111e7065b037edd1f2f625bcd4"} Oct 02 12:48:59 crc kubenswrapper[4929]: I1002 12:48:59.075263 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4kqts" event={"ID":"8a9c92bb-12e4-4a69-ba64-217902786770","Type":"ContainerStarted","Data":"33ea590b909f0a56da2649ae6f75536d443ef5d87d37f5ee134ea3a782214899"} Oct 02 12:48:59 crc kubenswrapper[4929]: I1002 12:48:59.079657 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hpmws" event={"ID":"e15929c9-ed67-44ce-8158-b85635f32121","Type":"ContainerStarted","Data":"c42b1f3d567f0e7e05db15602cc0719d2a21a5c5fc3f63f25f6fa816d6906a89"} Oct 02 12:48:59 crc kubenswrapper[4929]: I1002 12:48:59.100588 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-create-4kqts" podStartSLOduration=2.100567262 podStartE2EDuration="2.100567262s" podCreationTimestamp="2025-10-02 12:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:48:59.088499 +0000 UTC m=+5939.638865354" watchObservedRunningTime="2025-10-02 12:48:59.100567262 +0000 UTC m=+5939.650933626" Oct 02 12:49:00 crc kubenswrapper[4929]: I1002 12:49:00.097605 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hpmws" event={"ID":"e15929c9-ed67-44ce-8158-b85635f32121","Type":"ContainerStarted","Data":"f48fd2239583029ca9438a8a43fd8937ad2187aae653f6b8c760d6acbcd19633"} Oct 02 12:49:00 crc kubenswrapper[4929]: I1002 12:49:00.098050 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hpmws" Oct 02 12:49:00 crc kubenswrapper[4929]: I1002 12:49:00.099075 4929 generic.go:334] "Generic (PLEG): container finished" podID="8a9c92bb-12e4-4a69-ba64-217902786770" containerID="695d8e4de01a2142e05074a7c059e6c674c40b111e7065b037edd1f2f625bcd4" exitCode=0 Oct 02 12:49:00 crc kubenswrapper[4929]: I1002 12:49:00.099111 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4kqts" event={"ID":"8a9c92bb-12e4-4a69-ba64-217902786770","Type":"ContainerDied","Data":"695d8e4de01a2142e05074a7c059e6c674c40b111e7065b037edd1f2f625bcd4"} Oct 02 12:49:00 crc kubenswrapper[4929]: I1002 12:49:00.121176 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hpmws" podStartSLOduration=6.121157441 podStartE2EDuration="6.121157441s" podCreationTimestamp="2025-10-02 12:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:49:00.118523276 +0000 UTC m=+5940.668889650" watchObservedRunningTime="2025-10-02 12:49:00.121157441 +0000 UTC m=+5940.671523805" Oct 02 12:49:01 crc kubenswrapper[4929]: I1002 12:49:01.112221 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hpmws" Oct 02 12:49:01 crc kubenswrapper[4929]: I1002 12:49:01.495445 4929 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4kqts" Oct 02 12:49:01 crc kubenswrapper[4929]: I1002 12:49:01.664979 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k67lp\" (UniqueName: \"kubernetes.io/projected/8a9c92bb-12e4-4a69-ba64-217902786770-kube-api-access-k67lp\") pod \"8a9c92bb-12e4-4a69-ba64-217902786770\" (UID: \"8a9c92bb-12e4-4a69-ba64-217902786770\") " Oct 02 12:49:01 crc kubenswrapper[4929]: I1002 12:49:01.672611 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9c92bb-12e4-4a69-ba64-217902786770-kube-api-access-k67lp" (OuterVolumeSpecName: "kube-api-access-k67lp") pod "8a9c92bb-12e4-4a69-ba64-217902786770" (UID: "8a9c92bb-12e4-4a69-ba64-217902786770"). InnerVolumeSpecName "kube-api-access-k67lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:01 crc kubenswrapper[4929]: I1002 12:49:01.767830 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k67lp\" (UniqueName: \"kubernetes.io/projected/8a9c92bb-12e4-4a69-ba64-217902786770-kube-api-access-k67lp\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:02 crc kubenswrapper[4929]: I1002 12:49:02.134933 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-4kqts" Oct 02 12:49:02 crc kubenswrapper[4929]: I1002 12:49:02.135163 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-4kqts" event={"ID":"8a9c92bb-12e4-4a69-ba64-217902786770","Type":"ContainerDied","Data":"33ea590b909f0a56da2649ae6f75536d443ef5d87d37f5ee134ea3a782214899"} Oct 02 12:49:02 crc kubenswrapper[4929]: I1002 12:49:02.136522 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ea590b909f0a56da2649ae6f75536d443ef5d87d37f5ee134ea3a782214899" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.400330 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-67ef-account-create-5l7xv"] Oct 02 12:49:10 crc kubenswrapper[4929]: E1002 12:49:10.401344 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9c92bb-12e4-4a69-ba64-217902786770" containerName="mariadb-database-create" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.401358 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9c92bb-12e4-4a69-ba64-217902786770" containerName="mariadb-database-create" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.401557 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9c92bb-12e4-4a69-ba64-217902786770" containerName="mariadb-database-create" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.402237 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-67ef-account-create-5l7xv" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.404563 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.411286 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-67ef-account-create-5l7xv"] Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.450394 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnxl\" (UniqueName: \"kubernetes.io/projected/3356ee2b-8c66-4380-ae15-60353943d39f-kube-api-access-qfnxl\") pod \"octavia-67ef-account-create-5l7xv\" (UID: \"3356ee2b-8c66-4380-ae15-60353943d39f\") " pod="openstack/octavia-67ef-account-create-5l7xv" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.552731 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnxl\" (UniqueName: \"kubernetes.io/projected/3356ee2b-8c66-4380-ae15-60353943d39f-kube-api-access-qfnxl\") pod \"octavia-67ef-account-create-5l7xv\" (UID: \"3356ee2b-8c66-4380-ae15-60353943d39f\") " pod="openstack/octavia-67ef-account-create-5l7xv" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.592770 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnxl\" (UniqueName: \"kubernetes.io/projected/3356ee2b-8c66-4380-ae15-60353943d39f-kube-api-access-qfnxl\") pod \"octavia-67ef-account-create-5l7xv\" (UID: \"3356ee2b-8c66-4380-ae15-60353943d39f\") " pod="openstack/octavia-67ef-account-create-5l7xv" Oct 02 12:49:10 crc kubenswrapper[4929]: I1002 12:49:10.731902 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-67ef-account-create-5l7xv" Oct 02 12:49:11 crc kubenswrapper[4929]: I1002 12:49:11.180354 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-67ef-account-create-5l7xv"] Oct 02 12:49:11 crc kubenswrapper[4929]: I1002 12:49:11.214110 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-67ef-account-create-5l7xv" event={"ID":"3356ee2b-8c66-4380-ae15-60353943d39f","Type":"ContainerStarted","Data":"79cd115d94ef8178c03adda522673fb55e1391741f59283cc7f8df77c265f8a8"} Oct 02 12:49:12 crc kubenswrapper[4929]: I1002 12:49:12.223244 4929 generic.go:334] "Generic (PLEG): container finished" podID="3356ee2b-8c66-4380-ae15-60353943d39f" containerID="618fbc37d8d67855a8c79f70c2824a3d62c20ae082fcfe5b4090dee16ff8c893" exitCode=0 Oct 02 12:49:12 crc kubenswrapper[4929]: I1002 12:49:12.223338 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-67ef-account-create-5l7xv" event={"ID":"3356ee2b-8c66-4380-ae15-60353943d39f","Type":"ContainerDied","Data":"618fbc37d8d67855a8c79f70c2824a3d62c20ae082fcfe5b4090dee16ff8c893"} Oct 02 12:49:13 crc kubenswrapper[4929]: I1002 12:49:13.565652 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-67ef-account-create-5l7xv" Oct 02 12:49:13 crc kubenswrapper[4929]: I1002 12:49:13.712661 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnxl\" (UniqueName: \"kubernetes.io/projected/3356ee2b-8c66-4380-ae15-60353943d39f-kube-api-access-qfnxl\") pod \"3356ee2b-8c66-4380-ae15-60353943d39f\" (UID: \"3356ee2b-8c66-4380-ae15-60353943d39f\") " Oct 02 12:49:13 crc kubenswrapper[4929]: I1002 12:49:13.719174 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3356ee2b-8c66-4380-ae15-60353943d39f-kube-api-access-qfnxl" (OuterVolumeSpecName: "kube-api-access-qfnxl") pod "3356ee2b-8c66-4380-ae15-60353943d39f" (UID: "3356ee2b-8c66-4380-ae15-60353943d39f"). InnerVolumeSpecName "kube-api-access-qfnxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:13 crc kubenswrapper[4929]: I1002 12:49:13.815236 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfnxl\" (UniqueName: \"kubernetes.io/projected/3356ee2b-8c66-4380-ae15-60353943d39f-kube-api-access-qfnxl\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:14 crc kubenswrapper[4929]: I1002 12:49:14.247713 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-67ef-account-create-5l7xv" event={"ID":"3356ee2b-8c66-4380-ae15-60353943d39f","Type":"ContainerDied","Data":"79cd115d94ef8178c03adda522673fb55e1391741f59283cc7f8df77c265f8a8"} Oct 02 12:49:14 crc kubenswrapper[4929]: I1002 12:49:14.248000 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79cd115d94ef8178c03adda522673fb55e1391741f59283cc7f8df77c265f8a8" Oct 02 12:49:14 crc kubenswrapper[4929]: I1002 12:49:14.247815 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-67ef-account-create-5l7xv" Oct 02 12:49:14 crc kubenswrapper[4929]: I1002 12:49:14.736598 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:49:14 crc kubenswrapper[4929]: I1002 12:49:14.736945 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.411289 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-jxk5j"] Oct 02 12:49:16 crc kubenswrapper[4929]: E1002 12:49:16.412227 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3356ee2b-8c66-4380-ae15-60353943d39f" containerName="mariadb-account-create" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.412245 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3356ee2b-8c66-4380-ae15-60353943d39f" containerName="mariadb-account-create" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.412492 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3356ee2b-8c66-4380-ae15-60353943d39f" containerName="mariadb-account-create" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.413311 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jxk5j" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.420306 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-jxk5j"] Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.566371 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfnz\" (UniqueName: \"kubernetes.io/projected/1e7e6ee8-1706-4e44-a7e4-a932a2bbd847-kube-api-access-5sfnz\") pod \"octavia-persistence-db-create-jxk5j\" (UID: \"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847\") " pod="openstack/octavia-persistence-db-create-jxk5j" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.668767 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfnz\" (UniqueName: \"kubernetes.io/projected/1e7e6ee8-1706-4e44-a7e4-a932a2bbd847-kube-api-access-5sfnz\") pod \"octavia-persistence-db-create-jxk5j\" (UID: \"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847\") " pod="openstack/octavia-persistence-db-create-jxk5j" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.694729 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfnz\" (UniqueName: \"kubernetes.io/projected/1e7e6ee8-1706-4e44-a7e4-a932a2bbd847-kube-api-access-5sfnz\") pod \"octavia-persistence-db-create-jxk5j\" (UID: \"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847\") " pod="openstack/octavia-persistence-db-create-jxk5j" Oct 02 12:49:16 crc kubenswrapper[4929]: I1002 12:49:16.738537 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-jxk5j" Oct 02 12:49:17 crc kubenswrapper[4929]: I1002 12:49:17.213875 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-jxk5j"] Oct 02 12:49:17 crc kubenswrapper[4929]: I1002 12:49:17.280471 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jxk5j" event={"ID":"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847","Type":"ContainerStarted","Data":"b34539f3c8254d4b859955d03ec8d1312a82e388c9e86ad328353d0b5f0dbfc9"} Oct 02 12:49:18 crc kubenswrapper[4929]: I1002 12:49:18.290718 4929 generic.go:334] "Generic (PLEG): container finished" podID="1e7e6ee8-1706-4e44-a7e4-a932a2bbd847" containerID="39eb23fb45a461061f126955d6db4b7ae637b25c3cdea476668d967ab4911258" exitCode=0 Oct 02 12:49:18 crc kubenswrapper[4929]: I1002 12:49:18.290828 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jxk5j" event={"ID":"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847","Type":"ContainerDied","Data":"39eb23fb45a461061f126955d6db4b7ae637b25c3cdea476668d967ab4911258"} Oct 02 12:49:19 crc kubenswrapper[4929]: I1002 12:49:19.733343 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-jxk5j" Oct 02 12:49:19 crc kubenswrapper[4929]: I1002 12:49:19.824995 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sfnz\" (UniqueName: \"kubernetes.io/projected/1e7e6ee8-1706-4e44-a7e4-a932a2bbd847-kube-api-access-5sfnz\") pod \"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847\" (UID: \"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847\") " Oct 02 12:49:19 crc kubenswrapper[4929]: I1002 12:49:19.832249 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7e6ee8-1706-4e44-a7e4-a932a2bbd847-kube-api-access-5sfnz" (OuterVolumeSpecName: "kube-api-access-5sfnz") pod "1e7e6ee8-1706-4e44-a7e4-a932a2bbd847" (UID: "1e7e6ee8-1706-4e44-a7e4-a932a2bbd847"). InnerVolumeSpecName "kube-api-access-5sfnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:19 crc kubenswrapper[4929]: I1002 12:49:19.928140 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sfnz\" (UniqueName: \"kubernetes.io/projected/1e7e6ee8-1706-4e44-a7e4-a932a2bbd847-kube-api-access-5sfnz\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:20 crc kubenswrapper[4929]: I1002 12:49:20.311047 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-jxk5j" event={"ID":"1e7e6ee8-1706-4e44-a7e4-a932a2bbd847","Type":"ContainerDied","Data":"b34539f3c8254d4b859955d03ec8d1312a82e388c9e86ad328353d0b5f0dbfc9"} Oct 02 12:49:20 crc kubenswrapper[4929]: I1002 12:49:20.311422 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b34539f3c8254d4b859955d03ec8d1312a82e388c9e86ad328353d0b5f0dbfc9" Oct 02 12:49:20 crc kubenswrapper[4929]: I1002 12:49:20.311150 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-jxk5j" Oct 02 12:49:20 crc kubenswrapper[4929]: E1002 12:49:20.370389 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e7e6ee8_1706_4e44_a7e4_a932a2bbd847.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e7e6ee8_1706_4e44_a7e4_a932a2bbd847.slice/crio-b34539f3c8254d4b859955d03ec8d1312a82e388c9e86ad328353d0b5f0dbfc9\": RecentStats: unable to find data in memory cache]" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.500180 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-2514-account-create-6sxwx"] Oct 02 12:49:27 crc kubenswrapper[4929]: E1002 12:49:27.501160 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7e6ee8-1706-4e44-a7e4-a932a2bbd847" containerName="mariadb-database-create" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.501179 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7e6ee8-1706-4e44-a7e4-a932a2bbd847" containerName="mariadb-database-create" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.501433 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7e6ee8-1706-4e44-a7e4-a932a2bbd847" containerName="mariadb-database-create" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.502324 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-2514-account-create-6sxwx" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.505102 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.510873 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2514-account-create-6sxwx"] Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.586505 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp7cv\" (UniqueName: \"kubernetes.io/projected/c7e3189c-a438-45f6-bce7-8fa19475d892-kube-api-access-tp7cv\") pod \"octavia-2514-account-create-6sxwx\" (UID: \"c7e3189c-a438-45f6-bce7-8fa19475d892\") " pod="openstack/octavia-2514-account-create-6sxwx" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.688351 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp7cv\" (UniqueName: \"kubernetes.io/projected/c7e3189c-a438-45f6-bce7-8fa19475d892-kube-api-access-tp7cv\") pod \"octavia-2514-account-create-6sxwx\" (UID: \"c7e3189c-a438-45f6-bce7-8fa19475d892\") " pod="openstack/octavia-2514-account-create-6sxwx" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.707737 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp7cv\" (UniqueName: \"kubernetes.io/projected/c7e3189c-a438-45f6-bce7-8fa19475d892-kube-api-access-tp7cv\") pod \"octavia-2514-account-create-6sxwx\" (UID: \"c7e3189c-a438-45f6-bce7-8fa19475d892\") " pod="openstack/octavia-2514-account-create-6sxwx" Oct 02 12:49:27 crc kubenswrapper[4929]: I1002 12:49:27.832662 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2514-account-create-6sxwx" Oct 02 12:49:28 crc kubenswrapper[4929]: I1002 12:49:28.275780 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2514-account-create-6sxwx"] Oct 02 12:49:28 crc kubenswrapper[4929]: I1002 12:49:28.390996 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2514-account-create-6sxwx" event={"ID":"c7e3189c-a438-45f6-bce7-8fa19475d892","Type":"ContainerStarted","Data":"ba5a1201e6edba89eeb230004e06f22ce55a4d9e62754ebdd422a42870e87820"} Oct 02 12:49:29 crc kubenswrapper[4929]: I1002 12:49:29.401811 4929 generic.go:334] "Generic (PLEG): container finished" podID="c7e3189c-a438-45f6-bce7-8fa19475d892" containerID="d5c91ada8cb08d807c5b628888ca43273aeeb9e86e739be2f7625d976861f8f0" exitCode=0 Oct 02 12:49:29 crc kubenswrapper[4929]: I1002 12:49:29.401926 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2514-account-create-6sxwx" event={"ID":"c7e3189c-a438-45f6-bce7-8fa19475d892","Type":"ContainerDied","Data":"d5c91ada8cb08d807c5b628888ca43273aeeb9e86e739be2f7625d976861f8f0"} Oct 02 12:49:29 crc kubenswrapper[4929]: I1002 12:49:29.970907 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sc5zm" podUID="1ee7c592-3942-47da-9be7-e146a4768544" containerName="ovn-controller" probeResult="failure" output=< Oct 02 12:49:29 crc kubenswrapper[4929]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 12:49:29 crc kubenswrapper[4929]: > Oct 02 12:49:29 crc kubenswrapper[4929]: I1002 12:49:29.987161 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hpmws" Oct 02 12:49:29 crc kubenswrapper[4929]: I1002 12:49:29.990292 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hpmws" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.129511 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sc5zm-config-jjmnw"] Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.130910 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.133709 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.137513 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sc5zm-config-jjmnw"] Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.240262 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcl2m\" (UniqueName: \"kubernetes.io/projected/6a256b1e-9581-495c-8b93-03b86da55b1c-kube-api-access-gcl2m\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.240326 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-scripts\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.240415 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-additional-scripts\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.240836 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-log-ovn\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.240900 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.241096 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run-ovn\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.343458 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-additional-scripts\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.343633 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-log-ovn\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.343659 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.343733 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run-ovn\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.343821 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcl2m\" (UniqueName: \"kubernetes.io/projected/6a256b1e-9581-495c-8b93-03b86da55b1c-kube-api-access-gcl2m\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.343854 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-scripts\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.343979 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-log-ovn\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.344100 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run-ovn\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.344161 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.344341 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-additional-scripts\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.346352 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-scripts\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.364693 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcl2m\" (UniqueName: \"kubernetes.io/projected/6a256b1e-9581-495c-8b93-03b86da55b1c-kube-api-access-gcl2m\") pod \"ovn-controller-sc5zm-config-jjmnw\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.458753 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.807026 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-2514-account-create-6sxwx" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.851675 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp7cv\" (UniqueName: \"kubernetes.io/projected/c7e3189c-a438-45f6-bce7-8fa19475d892-kube-api-access-tp7cv\") pod \"c7e3189c-a438-45f6-bce7-8fa19475d892\" (UID: \"c7e3189c-a438-45f6-bce7-8fa19475d892\") " Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.858115 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e3189c-a438-45f6-bce7-8fa19475d892-kube-api-access-tp7cv" (OuterVolumeSpecName: "kube-api-access-tp7cv") pod "c7e3189c-a438-45f6-bce7-8fa19475d892" (UID: "c7e3189c-a438-45f6-bce7-8fa19475d892"). InnerVolumeSpecName "kube-api-access-tp7cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.954420 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp7cv\" (UniqueName: \"kubernetes.io/projected/c7e3189c-a438-45f6-bce7-8fa19475d892-kube-api-access-tp7cv\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:30 crc kubenswrapper[4929]: I1002 12:49:30.999251 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sc5zm-config-jjmnw"] Oct 02 12:49:31 crc kubenswrapper[4929]: I1002 12:49:31.420905 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2514-account-create-6sxwx" event={"ID":"c7e3189c-a438-45f6-bce7-8fa19475d892","Type":"ContainerDied","Data":"ba5a1201e6edba89eeb230004e06f22ce55a4d9e62754ebdd422a42870e87820"} Oct 02 12:49:31 crc kubenswrapper[4929]: I1002 12:49:31.421203 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba5a1201e6edba89eeb230004e06f22ce55a4d9e62754ebdd422a42870e87820" Oct 02 12:49:31 crc kubenswrapper[4929]: I1002 12:49:31.420917 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2514-account-create-6sxwx" Oct 02 12:49:31 crc kubenswrapper[4929]: I1002 12:49:31.422549 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm-config-jjmnw" event={"ID":"6a256b1e-9581-495c-8b93-03b86da55b1c","Type":"ContainerStarted","Data":"f8b188f9fcc84dee299361d4f076bc6e00809e83f10a83f783610cca84f2b0c6"} Oct 02 12:49:31 crc kubenswrapper[4929]: I1002 12:49:31.422579 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm-config-jjmnw" event={"ID":"6a256b1e-9581-495c-8b93-03b86da55b1c","Type":"ContainerStarted","Data":"20991d44da3ddd51c9fe78b2bf96e3421136583ee9638d4faeee1ffaa19c42f8"} Oct 02 12:49:31 crc kubenswrapper[4929]: I1002 12:49:31.445270 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sc5zm-config-jjmnw" podStartSLOduration=1.445247466 podStartE2EDuration="1.445247466s" podCreationTimestamp="2025-10-02 12:49:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:49:31.441621414 +0000 UTC m=+5971.991987788" watchObservedRunningTime="2025-10-02 12:49:31.445247466 +0000 UTC m=+5971.995613830" Oct 02 12:49:32 crc kubenswrapper[4929]: I1002 12:49:32.433079 4929 generic.go:334] "Generic (PLEG): container finished" podID="6a256b1e-9581-495c-8b93-03b86da55b1c" containerID="f8b188f9fcc84dee299361d4f076bc6e00809e83f10a83f783610cca84f2b0c6" exitCode=0 Oct 02 12:49:32 crc kubenswrapper[4929]: I1002 12:49:32.433147 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm-config-jjmnw" event={"ID":"6a256b1e-9581-495c-8b93-03b86da55b1c","Type":"ContainerDied","Data":"f8b188f9fcc84dee299361d4f076bc6e00809e83f10a83f783610cca84f2b0c6"} Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.818858 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918110 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run\") pod \"6a256b1e-9581-495c-8b93-03b86da55b1c\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918438 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-additional-scripts\") pod \"6a256b1e-9581-495c-8b93-03b86da55b1c\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918236 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run" (OuterVolumeSpecName: "var-run") pod "6a256b1e-9581-495c-8b93-03b86da55b1c" (UID: "6a256b1e-9581-495c-8b93-03b86da55b1c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918487 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcl2m\" (UniqueName: \"kubernetes.io/projected/6a256b1e-9581-495c-8b93-03b86da55b1c-kube-api-access-gcl2m\") pod \"6a256b1e-9581-495c-8b93-03b86da55b1c\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918562 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run-ovn\") pod \"6a256b1e-9581-495c-8b93-03b86da55b1c\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918683 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-scripts\") pod \"6a256b1e-9581-495c-8b93-03b86da55b1c\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918730 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6a256b1e-9581-495c-8b93-03b86da55b1c" (UID: "6a256b1e-9581-495c-8b93-03b86da55b1c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918800 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-log-ovn\") pod \"6a256b1e-9581-495c-8b93-03b86da55b1c\" (UID: \"6a256b1e-9581-495c-8b93-03b86da55b1c\") " Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.918951 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6a256b1e-9581-495c-8b93-03b86da55b1c" (UID: "6a256b1e-9581-495c-8b93-03b86da55b1c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.919259 4929 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.919283 4929 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.919291 4929 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6a256b1e-9581-495c-8b93-03b86da55b1c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.919903 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-scripts" (OuterVolumeSpecName: "scripts") pod "6a256b1e-9581-495c-8b93-03b86da55b1c" (UID: "6a256b1e-9581-495c-8b93-03b86da55b1c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.920051 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6a256b1e-9581-495c-8b93-03b86da55b1c" (UID: "6a256b1e-9581-495c-8b93-03b86da55b1c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:49:33 crc kubenswrapper[4929]: I1002 12:49:33.925352 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a256b1e-9581-495c-8b93-03b86da55b1c-kube-api-access-gcl2m" (OuterVolumeSpecName: "kube-api-access-gcl2m") pod "6a256b1e-9581-495c-8b93-03b86da55b1c" (UID: "6a256b1e-9581-495c-8b93-03b86da55b1c"). InnerVolumeSpecName "kube-api-access-gcl2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.021623 4929 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.021659 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcl2m\" (UniqueName: \"kubernetes.io/projected/6a256b1e-9581-495c-8b93-03b86da55b1c-kube-api-access-gcl2m\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.021672 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a256b1e-9581-495c-8b93-03b86da55b1c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.459130 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm-config-jjmnw" event={"ID":"6a256b1e-9581-495c-8b93-03b86da55b1c","Type":"ContainerDied","Data":"20991d44da3ddd51c9fe78b2bf96e3421136583ee9638d4faeee1ffaa19c42f8"} Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.459181 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20991d44da3ddd51c9fe78b2bf96e3421136583ee9638d4faeee1ffaa19c42f8" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.459246 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-jjmnw" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.466203 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-5d7d68fcb6-96k42"] Oct 02 12:49:34 crc kubenswrapper[4929]: E1002 12:49:34.466727 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e3189c-a438-45f6-bce7-8fa19475d892" containerName="mariadb-account-create" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.466749 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e3189c-a438-45f6-bce7-8fa19475d892" containerName="mariadb-account-create" Oct 02 12:49:34 crc kubenswrapper[4929]: E1002 12:49:34.466764 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a256b1e-9581-495c-8b93-03b86da55b1c" containerName="ovn-config" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.466770 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a256b1e-9581-495c-8b93-03b86da55b1c" containerName="ovn-config" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.467000 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e3189c-a438-45f6-bce7-8fa19475d892" containerName="mariadb-account-create" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.467026 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a256b1e-9581-495c-8b93-03b86da55b1c" containerName="ovn-config" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.470125 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.471882 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.471903 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.472002 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-tw24v" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.508979 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5d7d68fcb6-96k42"] Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.532173 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-octavia-run\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.532236 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-config-data\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.532286 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-scripts\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.532376 
4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-config-data-merged\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.532447 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-combined-ca-bundle\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.558080 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sc5zm-config-jjmnw"] Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.578378 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sc5zm-config-jjmnw"] Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.634784 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-scripts\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.635436 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-config-data-merged\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.635612 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-combined-ca-bundle\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.635827 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-octavia-run\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.635891 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-config-data\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.636013 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-config-data-merged\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.636881 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-octavia-run\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.641877 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-scripts\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.642138 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-combined-ca-bundle\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.667667 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sc5zm-config-74s9q"] Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.671217 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d602bfd-b57a-43b4-aaee-d2766cee3ec4-config-data\") pod \"octavia-api-5d7d68fcb6-96k42\" (UID: \"0d602bfd-b57a-43b4-aaee-d2766cee3ec4\") " pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.672896 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.676494 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.683090 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sc5zm-config-74s9q"] Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.738324 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-scripts\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.738630 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srh7c\" (UniqueName: \"kubernetes.io/projected/5f84c84a-8f09-435c-a479-e48389a459ff-kube-api-access-srh7c\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.738655 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run-ovn\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.738693 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.738735 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-additional-scripts\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.738943 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-log-ovn\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.800640 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841107 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-scripts\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841198 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srh7c\" (UniqueName: \"kubernetes.io/projected/5f84c84a-8f09-435c-a479-e48389a459ff-kube-api-access-srh7c\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841232 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run-ovn\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841287 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841344 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-additional-scripts\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841403 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-log-ovn\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: 
\"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841759 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-log-ovn\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841769 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.841826 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run-ovn\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.842595 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-additional-scripts\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.843466 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-scripts\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.861289 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srh7c\" (UniqueName: \"kubernetes.io/projected/5f84c84a-8f09-435c-a479-e48389a459ff-kube-api-access-srh7c\") pod \"ovn-controller-sc5zm-config-74s9q\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:34 crc kubenswrapper[4929]: I1002 12:49:34.984739 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-sc5zm" Oct 02 12:49:35 crc kubenswrapper[4929]: I1002 12:49:35.053753 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:35 crc kubenswrapper[4929]: I1002 12:49:35.332162 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-5d7d68fcb6-96k42"] Oct 02 12:49:35 crc kubenswrapper[4929]: I1002 12:49:35.470692 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5d7d68fcb6-96k42" event={"ID":"0d602bfd-b57a-43b4-aaee-d2766cee3ec4","Type":"ContainerStarted","Data":"29b902cce91469a6fb19e1debcfdc272f2818e0fb878d0f0c110792f4861ba4b"} Oct 02 12:49:35 crc kubenswrapper[4929]: I1002 12:49:35.583533 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sc5zm-config-74s9q"] Oct 02 12:49:36 crc kubenswrapper[4929]: I1002 12:49:36.194065 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a256b1e-9581-495c-8b93-03b86da55b1c" path="/var/lib/kubelet/pods/6a256b1e-9581-495c-8b93-03b86da55b1c/volumes" Oct 02 12:49:36 crc kubenswrapper[4929]: I1002 12:49:36.484585 4929 generic.go:334] "Generic (PLEG): container finished" podID="5f84c84a-8f09-435c-a479-e48389a459ff" containerID="96d9f03215cf48e90bc7d7889bea03f8a11e849239561d32e5d59b468bf6b35e" exitCode=0 Oct 02 12:49:36 crc kubenswrapper[4929]: I1002 12:49:36.484710 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm-config-74s9q" event={"ID":"5f84c84a-8f09-435c-a479-e48389a459ff","Type":"ContainerDied","Data":"96d9f03215cf48e90bc7d7889bea03f8a11e849239561d32e5d59b468bf6b35e"} Oct 02 12:49:36 crc kubenswrapper[4929]: I1002 12:49:36.484994 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm-config-74s9q" event={"ID":"5f84c84a-8f09-435c-a479-e48389a459ff","Type":"ContainerStarted","Data":"3958a54b05a13e582eed343f95387c477478a158bc5761c4e4eb162a06096b2c"} Oct 02 12:49:37 crc kubenswrapper[4929]: I1002 12:49:37.899306 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.028466 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-log-ovn\") pod \"5f84c84a-8f09-435c-a479-e48389a459ff\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.028542 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-scripts\") pod \"5f84c84a-8f09-435c-a479-e48389a459ff\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.028565 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run\") pod \"5f84c84a-8f09-435c-a479-e48389a459ff\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.028713 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srh7c\" (UniqueName: \"kubernetes.io/projected/5f84c84a-8f09-435c-a479-e48389a459ff-kube-api-access-srh7c\") pod \"5f84c84a-8f09-435c-a479-e48389a459ff\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.028780 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run-ovn\") pod \"5f84c84a-8f09-435c-a479-e48389a459ff\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.028831 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-additional-scripts\") pod \"5f84c84a-8f09-435c-a479-e48389a459ff\" (UID: \"5f84c84a-8f09-435c-a479-e48389a459ff\") " Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.030313 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5f84c84a-8f09-435c-a479-e48389a459ff" (UID: "5f84c84a-8f09-435c-a479-e48389a459ff"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.030360 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5f84c84a-8f09-435c-a479-e48389a459ff" (UID: "5f84c84a-8f09-435c-a479-e48389a459ff"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.031628 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-scripts" (OuterVolumeSpecName: "scripts") pod "5f84c84a-8f09-435c-a479-e48389a459ff" (UID: "5f84c84a-8f09-435c-a479-e48389a459ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.031657 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run" (OuterVolumeSpecName: "var-run") pod "5f84c84a-8f09-435c-a479-e48389a459ff" (UID: "5f84c84a-8f09-435c-a479-e48389a459ff"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.033287 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5f84c84a-8f09-435c-a479-e48389a459ff" (UID: "5f84c84a-8f09-435c-a479-e48389a459ff"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.037780 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f84c84a-8f09-435c-a479-e48389a459ff-kube-api-access-srh7c" (OuterVolumeSpecName: "kube-api-access-srh7c") pod "5f84c84a-8f09-435c-a479-e48389a459ff" (UID: "5f84c84a-8f09-435c-a479-e48389a459ff"). InnerVolumeSpecName "kube-api-access-srh7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.131710 4929 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.131752 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.131761 4929 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.131770 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srh7c\" (UniqueName: \"kubernetes.io/projected/5f84c84a-8f09-435c-a479-e48389a459ff-kube-api-access-srh7c\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.131785 4929 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f84c84a-8f09-435c-a479-e48389a459ff-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.131793 4929 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5f84c84a-8f09-435c-a479-e48389a459ff-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.505543 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sc5zm-config-74s9q" event={"ID":"5f84c84a-8f09-435c-a479-e48389a459ff","Type":"ContainerDied","Data":"3958a54b05a13e582eed343f95387c477478a158bc5761c4e4eb162a06096b2c"} Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.505594 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3958a54b05a13e582eed343f95387c477478a158bc5761c4e4eb162a06096b2c" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 
12:49:38.505621 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sc5zm-config-74s9q" Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.971815 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sc5zm-config-74s9q"] Oct 02 12:49:38 crc kubenswrapper[4929]: I1002 12:49:38.982650 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sc5zm-config-74s9q"] Oct 02 12:49:40 crc kubenswrapper[4929]: I1002 12:49:40.169171 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f84c84a-8f09-435c-a479-e48389a459ff" path="/var/lib/kubelet/pods/5f84c84a-8f09-435c-a479-e48389a459ff/volumes" Oct 02 12:49:44 crc kubenswrapper[4929]: I1002 12:49:44.737116 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:49:44 crc kubenswrapper[4929]: I1002 12:49:44.737862 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:49:44 crc kubenswrapper[4929]: I1002 12:49:44.737917 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 12:49:44 crc kubenswrapper[4929]: I1002 12:49:44.738688 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:49:44 crc kubenswrapper[4929]: I1002 12:49:44.738734 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" gracePeriod=600 Oct 02 12:49:45 crc kubenswrapper[4929]: I1002 12:49:45.580440 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" exitCode=0 Oct 02 12:49:45 crc kubenswrapper[4929]: I1002 12:49:45.580801 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0"} Oct 02 12:49:45 crc kubenswrapper[4929]: I1002 12:49:45.580844 4929 scope.go:117] "RemoveContainer" containerID="fd3f3300044292572692a4205fb0d2be0b602520d522a4e0786217e15a1c757a" Oct 02 12:49:46 crc kubenswrapper[4929]: E1002 12:49:46.609704 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:49:47 crc kubenswrapper[4929]: I1002 12:49:47.599886 4929 generic.go:334] "Generic (PLEG): container finished" podID="0d602bfd-b57a-43b4-aaee-d2766cee3ec4" containerID="73b8263dd230da91c79760008e3179526974cfccbb2cd840e7533d7eb0ce0807" exitCode=0 Oct 02 12:49:47 crc kubenswrapper[4929]: I1002 12:49:47.599998 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5d7d68fcb6-96k42" event={"ID":"0d602bfd-b57a-43b4-aaee-d2766cee3ec4","Type":"ContainerDied","Data":"73b8263dd230da91c79760008e3179526974cfccbb2cd840e7533d7eb0ce0807"} Oct 02 12:49:47 crc kubenswrapper[4929]: I1002 12:49:47.604191 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:49:47 crc kubenswrapper[4929]: E1002 12:49:47.604509 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.472771 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bd8cj"] Oct 02 12:49:48 crc kubenswrapper[4929]: E1002 12:49:48.473842 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f84c84a-8f09-435c-a479-e48389a459ff" containerName="ovn-config" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.473862 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f84c84a-8f09-435c-a479-e48389a459ff" containerName="ovn-config" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.476423 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f84c84a-8f09-435c-a479-e48389a459ff" containerName="ovn-config" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.478245 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.494397 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bd8cj"] Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.537634 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-catalog-content\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.538034 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lgmk\" (UniqueName: \"kubernetes.io/projected/e257b306-16b1-446b-98cc-0fe390d1026b-kube-api-access-5lgmk\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.538291 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-utilities\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.616173 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5d7d68fcb6-96k42" event={"ID":"0d602bfd-b57a-43b4-aaee-d2766cee3ec4","Type":"ContainerStarted","Data":"6c511cc2f3a87bb1b1170dde6382803aba727fa1d63471e550fdb43f3f60be8e"} Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.616212 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-5d7d68fcb6-96k42" event={"ID":"0d602bfd-b57a-43b4-aaee-d2766cee3ec4","Type":"ContainerStarted","Data":"203d9af75b775d50297145d8441c161c9eac1ac1d4620e9e4c38ef1b2c995c51"} Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.616408 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.616933 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.641228 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lgmk\" (UniqueName: \"kubernetes.io/projected/e257b306-16b1-446b-98cc-0fe390d1026b-kube-api-access-5lgmk\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.641705 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-utilities\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.642222 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-utilities\") pod \"redhat-operators-bd8cj\" (UID: 
\"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.642254 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-5d7d68fcb6-96k42" podStartSLOduration=3.224210402 podStartE2EDuration="14.642234062s" podCreationTimestamp="2025-10-02 12:49:34 +0000 UTC" firstStartedPulling="2025-10-02 12:49:35.342259718 +0000 UTC m=+5975.892626082" lastFinishedPulling="2025-10-02 12:49:46.760283378 +0000 UTC m=+5987.310649742" observedRunningTime="2025-10-02 12:49:48.641153592 +0000 UTC m=+5989.191519976" watchObservedRunningTime="2025-10-02 12:49:48.642234062 +0000 UTC m=+5989.192600426" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.642383 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-catalog-content\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.642700 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-catalog-content\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.665570 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lgmk\" (UniqueName: \"kubernetes.io/projected/e257b306-16b1-446b-98cc-0fe390d1026b-kube-api-access-5lgmk\") pod \"redhat-operators-bd8cj\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:48 crc kubenswrapper[4929]: I1002 12:49:48.797471 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:49 crc kubenswrapper[4929]: I1002 12:49:49.293395 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bd8cj"] Oct 02 12:49:49 crc kubenswrapper[4929]: W1002 12:49:49.295149 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode257b306_16b1_446b_98cc_0fe390d1026b.slice/crio-9ff287d3a77379cce59adfdea1b4434a74e90b20ad1187415a5ae1b1b2cbff6f WatchSource:0}: Error finding container 9ff287d3a77379cce59adfdea1b4434a74e90b20ad1187415a5ae1b1b2cbff6f: Status 404 returned error can't find the container with id 9ff287d3a77379cce59adfdea1b4434a74e90b20ad1187415a5ae1b1b2cbff6f Oct 02 12:49:49 crc kubenswrapper[4929]: I1002 12:49:49.627919 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bd8cj" event={"ID":"e257b306-16b1-446b-98cc-0fe390d1026b","Type":"ContainerDied","Data":"82f6e7c8b0d69574acd2c6a4ec1bc18974ac21b1ad2e04f266da0dc5a1841379"} Oct 02 12:49:49 crc kubenswrapper[4929]: I1002 12:49:49.628062 4929 generic.go:334] "Generic (PLEG): container finished" podID="e257b306-16b1-446b-98cc-0fe390d1026b" containerID="82f6e7c8b0d69574acd2c6a4ec1bc18974ac21b1ad2e04f266da0dc5a1841379" exitCode=0 Oct 02 12:49:49 crc kubenswrapper[4929]: I1002 12:49:49.628501 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bd8cj" event={"ID":"e257b306-16b1-446b-98cc-0fe390d1026b","Type":"ContainerStarted","Data":"9ff287d3a77379cce59adfdea1b4434a74e90b20ad1187415a5ae1b1b2cbff6f"} Oct 02 12:49:52 crc kubenswrapper[4929]: I1002 12:49:52.657496 4929 generic.go:334] "Generic (PLEG): container finished" podID="e257b306-16b1-446b-98cc-0fe390d1026b" containerID="e9245e21d092137e123999929759af0d9825fa27c31f6f5ebce77055d9119fe4" exitCode=0 Oct 02 12:49:52 crc kubenswrapper[4929]: I1002 12:49:52.657562 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bd8cj" event={"ID":"e257b306-16b1-446b-98cc-0fe390d1026b","Type":"ContainerDied","Data":"e9245e21d092137e123999929759af0d9825fa27c31f6f5ebce77055d9119fe4"} Oct 02 12:49:54 crc kubenswrapper[4929]: I1002 12:49:54.679108 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bd8cj" event={"ID":"e257b306-16b1-446b-98cc-0fe390d1026b","Type":"ContainerStarted","Data":"98b30c0fe9f4c189c93d5ab48a8b5b076bf4c208a23accb49516dc70a622aeb3"} Oct 02 12:49:55 crc kubenswrapper[4929]: I1002 12:49:55.711067 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bd8cj" podStartSLOduration=3.74609417 podStartE2EDuration="7.711039227s" podCreationTimestamp="2025-10-02 12:49:48 +0000 UTC" firstStartedPulling="2025-10-02 12:49:49.628930131 +0000 UTC m=+5990.179296495" lastFinishedPulling="2025-10-02 12:49:53.593875188 +0000 UTC m=+5994.144241552" observedRunningTime="2025-10-02 12:49:55.709797692 +0000 UTC m=+5996.260164066" watchObservedRunningTime="2025-10-02 12:49:55.711039227 +0000 UTC m=+5996.261405591" Oct 02 12:49:58 crc kubenswrapper[4929]: I1002 12:49:58.797666 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:58 crc kubenswrapper[4929]: I1002 12:49:58.797743 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:49:59 crc kubenswrapper[4929]: I1002 12:49:59.843414 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:49:59 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:49:59 crc kubenswrapper[4929]: > Oct 02 12:50:02 crc kubenswrapper[4929]: I1002 12:50:02.159187 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:50:02 crc kubenswrapper[4929]: E1002 12:50:02.159848 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.363928 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-2nxzx"] Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.365795 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.370367 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.370457 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.370693 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.378582 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-2nxzx"] Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.425580 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb9a69c-530c-408f-80fe-0027e366683f-config-data\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.425768 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5cb9a69c-530c-408f-80fe-0027e366683f-config-data-merged\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.425809 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5cb9a69c-530c-408f-80fe-0027e366683f-hm-ports\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.425847 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5cb9a69c-530c-408f-80fe-0027e366683f-scripts\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.527505 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5cb9a69c-530c-408f-80fe-0027e366683f-config-data-merged\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.527900 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5cb9a69c-530c-408f-80fe-0027e366683f-hm-ports\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.528077 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb9a69c-530c-408f-80fe-0027e366683f-scripts\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.527929 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5cb9a69c-530c-408f-80fe-0027e366683f-config-data-merged\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.528259 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb9a69c-530c-408f-80fe-0027e366683f-config-data\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.529145 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/5cb9a69c-530c-408f-80fe-0027e366683f-hm-ports\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.534329 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb9a69c-530c-408f-80fe-0027e366683f-config-data\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.537848 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb9a69c-530c-408f-80fe-0027e366683f-scripts\") pod \"octavia-rsyslog-2nxzx\" (UID: \"5cb9a69c-530c-408f-80fe-0027e366683f\") " pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:03 crc kubenswrapper[4929]: I1002 12:50:03.692814 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.286819 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-2nxzx"] Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.613804 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-fn8m4"] Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.616033 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.622769 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.625877 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-fn8m4"] Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.659808 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/01f35f8a-714e-402d-a510-da81db3b019a-amphora-image\") pod \"octavia-image-upload-59f8cff499-fn8m4\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.659928 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01f35f8a-714e-402d-a510-da81db3b019a-httpd-config\") pod \"octavia-image-upload-59f8cff499-fn8m4\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.763030 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01f35f8a-714e-402d-a510-da81db3b019a-httpd-config\") pod \"octavia-image-upload-59f8cff499-fn8m4\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.763308 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/01f35f8a-714e-402d-a510-da81db3b019a-amphora-image\") pod \"octavia-image-upload-59f8cff499-fn8m4\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.763815 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/01f35f8a-714e-402d-a510-da81db3b019a-amphora-image\") pod \"octavia-image-upload-59f8cff499-fn8m4\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.772926 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01f35f8a-714e-402d-a510-da81db3b019a-httpd-config\") pod \"octavia-image-upload-59f8cff499-fn8m4\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.792349 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-2nxzx" 
event={"ID":"5cb9a69c-530c-408f-80fe-0027e366683f","Type":"ContainerStarted","Data":"0d1c061041d2897dfb61052a2a21a4d233396be7f41b074adeb2c83ab96d2fa8"} Oct 02 12:50:04 crc kubenswrapper[4929]: I1002 12:50:04.947408 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:50:05 crc kubenswrapper[4929]: I1002 12:50:05.728576 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-fn8m4"] Oct 02 12:50:05 crc kubenswrapper[4929]: W1002 12:50:05.746241 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f35f8a_714e_402d_a510_da81db3b019a.slice/crio-c30a970508c080b50d35da555aa9fb31acae48208cecf19e02bb47d0144b8c13 WatchSource:0}: Error finding container c30a970508c080b50d35da555aa9fb31acae48208cecf19e02bb47d0144b8c13: Status 404 returned error can't find the container with id c30a970508c080b50d35da555aa9fb31acae48208cecf19e02bb47d0144b8c13 Oct 02 12:50:05 crc kubenswrapper[4929]: I1002 12:50:05.830732 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" event={"ID":"01f35f8a-714e-402d-a510-da81db3b019a","Type":"ContainerStarted","Data":"c30a970508c080b50d35da555aa9fb31acae48208cecf19e02bb47d0144b8c13"} Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.494665 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-crswf"] Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.496539 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.499266 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.505402 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-crswf"] Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.613588 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-combined-ca-bundle\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.613727 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.614034 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-scripts\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.614097 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data-merged\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") 
" pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.715752 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-scripts\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.716107 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data-merged\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.716149 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-combined-ca-bundle\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.716198 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.716938 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data-merged\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.722820 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.723253 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-combined-ca-bundle\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.725304 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-scripts\") pod \"octavia-db-sync-crswf\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:06 crc kubenswrapper[4929]: I1002 12:50:06.836605 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:07 crc kubenswrapper[4929]: I1002 12:50:07.506727 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-crswf"] Oct 02 12:50:08 crc kubenswrapper[4929]: I1002 12:50:08.878775 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-2nxzx" event={"ID":"5cb9a69c-530c-408f-80fe-0027e366683f","Type":"ContainerStarted","Data":"11644b33a6b6ac7f27a544b5e5f89093cf2d8ef9e800faa08a2a11221c25d379"} Oct 02 12:50:09 crc kubenswrapper[4929]: I1002 12:50:09.862470 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:50:09 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:50:09 crc kubenswrapper[4929]: > Oct 02 12:50:10 crc kubenswrapper[4929]: I1002 12:50:10.992047 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:50:11 crc kubenswrapper[4929]: I1002 12:50:11.007279 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-5d7d68fcb6-96k42" Oct 02 12:50:11 crc kubenswrapper[4929]: W1002 12:50:11.158019 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62ae9256_6d03_4694_b34f_246bff68e4f7.slice/crio-41c922bb7972895ae088fcec52c7116da6862cdf706933ac08984c566d8423ee WatchSource:0}: Error finding container 41c922bb7972895ae088fcec52c7116da6862cdf706933ac08984c566d8423ee: Status 404 returned error can't find the container with id 41c922bb7972895ae088fcec52c7116da6862cdf706933ac08984c566d8423ee Oct 02 12:50:11 crc kubenswrapper[4929]: I1002 12:50:11.913059 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-crswf" event={"ID":"62ae9256-6d03-4694-b34f-246bff68e4f7","Type":"ContainerStarted","Data":"41c922bb7972895ae088fcec52c7116da6862cdf706933ac08984c566d8423ee"} Oct 02 12:50:14 crc kubenswrapper[4929]: I1002 12:50:14.162180 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:50:14 crc kubenswrapper[4929]: E1002 12:50:14.163096 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:50:14 crc kubenswrapper[4929]: I1002 12:50:14.951894 4929 generic.go:334] "Generic (PLEG): container finished" podID="5cb9a69c-530c-408f-80fe-0027e366683f" containerID="11644b33a6b6ac7f27a544b5e5f89093cf2d8ef9e800faa08a2a11221c25d379" exitCode=0 Oct 02 12:50:14 crc kubenswrapper[4929]: I1002 12:50:14.951943 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-2nxzx" event={"ID":"5cb9a69c-530c-408f-80fe-0027e366683f","Type":"ContainerDied","Data":"11644b33a6b6ac7f27a544b5e5f89093cf2d8ef9e800faa08a2a11221c25d379"} Oct 02 12:50:15 crc kubenswrapper[4929]: I1002 12:50:15.043395 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-gnrvj"] Oct 
02 12:50:15 crc kubenswrapper[4929]: I1002 12:50:15.057438 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-gnrvj"] Oct 02 12:50:15 crc kubenswrapper[4929]: I1002 12:50:15.964578 4929 generic.go:334] "Generic (PLEG): container finished" podID="62ae9256-6d03-4694-b34f-246bff68e4f7" containerID="cf967af275e2f7144520bd582eb0ba82933ea94ea5996aced98b61dc4a2650bd" exitCode=0 Oct 02 12:50:15 crc kubenswrapper[4929]: I1002 12:50:15.964629 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-crswf" event={"ID":"62ae9256-6d03-4694-b34f-246bff68e4f7","Type":"ContainerDied","Data":"cf967af275e2f7144520bd582eb0ba82933ea94ea5996aced98b61dc4a2650bd"} Oct 02 12:50:16 crc kubenswrapper[4929]: I1002 12:50:16.173575 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed25c231-e5a1-4e95-91f6-ec543494202b" path="/var/lib/kubelet/pods/ed25c231-e5a1-4e95-91f6-ec543494202b/volumes" Oct 02 12:50:17 crc kubenswrapper[4929]: I1002 12:50:17.036307 4929 scope.go:117] "RemoveContainer" containerID="9d29a1cf1c362f8b4c17aa145acfb0e868d6ae569f07f95dd6a0dbbe95172abb" Oct 02 12:50:19 crc kubenswrapper[4929]: I1002 12:50:19.846141 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:50:19 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:50:19 crc kubenswrapper[4929]: > Oct 02 12:50:24 crc kubenswrapper[4929]: I1002 12:50:24.057593 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-2nxzx" event={"ID":"5cb9a69c-530c-408f-80fe-0027e366683f","Type":"ContainerStarted","Data":"44b957bc4d2c473d14a784f17d840d98e34ec463443652c0a56d4144f71511fa"} Oct 02 12:50:24 crc kubenswrapper[4929]: I1002 12:50:24.058459 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:24 crc kubenswrapper[4929]: I1002 12:50:24.059371 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" event={"ID":"01f35f8a-714e-402d-a510-da81db3b019a","Type":"ContainerStarted","Data":"067154614f6a8595be66605f97c55aa864e94e53225f601dc7d04a9ea36195ac"} Oct 02 12:50:24 crc kubenswrapper[4929]: I1002 12:50:24.061826 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-crswf" event={"ID":"62ae9256-6d03-4694-b34f-246bff68e4f7","Type":"ContainerStarted","Data":"1ea5b93107eee9f0fbd5f852bf0e01dfb6d184746e7f01efbb830c1793f323af"} Oct 02 12:50:24 crc kubenswrapper[4929]: I1002 12:50:24.102481 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-2nxzx" podStartSLOduration=1.889371546 podStartE2EDuration="21.102463927s" podCreationTimestamp="2025-10-02 12:50:03 +0000 UTC" firstStartedPulling="2025-10-02 12:50:04.284669111 +0000 UTC m=+6004.835035475" lastFinishedPulling="2025-10-02 12:50:23.497761492 +0000 UTC m=+6024.048127856" observedRunningTime="2025-10-02 12:50:24.087531324 +0000 UTC m=+6024.637897708" watchObservedRunningTime="2025-10-02 12:50:24.102463927 +0000 UTC m=+6024.652830291" Oct 02 12:50:24 crc kubenswrapper[4929]: I1002 12:50:24.102687 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-crswf" podStartSLOduration=18.102683563 podStartE2EDuration="18.102683563s" 
podCreationTimestamp="2025-10-02 12:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:50:24.101380276 +0000 UTC m=+6024.651746640" watchObservedRunningTime="2025-10-02 12:50:24.102683563 +0000 UTC m=+6024.653049927" Oct 02 12:50:26 crc kubenswrapper[4929]: I1002 12:50:26.034517 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-87e3-account-create-q47md"] Oct 02 12:50:26 crc kubenswrapper[4929]: I1002 12:50:26.044508 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-87e3-account-create-q47md"] Oct 02 12:50:26 crc kubenswrapper[4929]: I1002 12:50:26.167823 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e12f0c57-b158-4d49-ae7f-b984511da980" path="/var/lib/kubelet/pods/e12f0c57-b158-4d49-ae7f-b984511da980/volumes" Oct 02 12:50:29 crc kubenswrapper[4929]: I1002 12:50:29.110762 4929 generic.go:334] "Generic (PLEG): container finished" podID="01f35f8a-714e-402d-a510-da81db3b019a" containerID="067154614f6a8595be66605f97c55aa864e94e53225f601dc7d04a9ea36195ac" exitCode=0 Oct 02 12:50:29 crc kubenswrapper[4929]: I1002 12:50:29.110838 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" event={"ID":"01f35f8a-714e-402d-a510-da81db3b019a","Type":"ContainerDied","Data":"067154614f6a8595be66605f97c55aa864e94e53225f601dc7d04a9ea36195ac"} Oct 02 12:50:29 crc kubenswrapper[4929]: I1002 12:50:29.157561 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:50:29 crc kubenswrapper[4929]: E1002 12:50:29.157826 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:50:29 crc kubenswrapper[4929]: I1002 12:50:29.849569 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:50:29 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:50:29 crc kubenswrapper[4929]: > Oct 02 12:50:31 crc kubenswrapper[4929]: I1002 12:50:31.133837 4929 generic.go:334] "Generic (PLEG): container finished" podID="62ae9256-6d03-4694-b34f-246bff68e4f7" containerID="1ea5b93107eee9f0fbd5f852bf0e01dfb6d184746e7f01efbb830c1793f323af" exitCode=0 Oct 02 12:50:31 crc kubenswrapper[4929]: I1002 12:50:31.133921 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-crswf" event={"ID":"62ae9256-6d03-4694-b34f-246bff68e4f7","Type":"ContainerDied","Data":"1ea5b93107eee9f0fbd5f852bf0e01dfb6d184746e7f01efbb830c1793f323af"} Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.146523 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" event={"ID":"01f35f8a-714e-402d-a510-da81db3b019a","Type":"ContainerStarted","Data":"34f0c1874560053742914f10192a257dfdf966668c8ce5ab8bd053ff2904d687"} Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.193089 4929 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" podStartSLOduration=2.493878018 podStartE2EDuration="28.193066834s" podCreationTimestamp="2025-10-02 12:50:04 +0000 UTC" firstStartedPulling="2025-10-02 12:50:05.749176247 +0000 UTC m=+6006.299542611" lastFinishedPulling="2025-10-02 12:50:31.448365063 +0000 UTC m=+6031.998731427" observedRunningTime="2025-10-02 12:50:32.161662784 +0000 UTC m=+6032.712029168" watchObservedRunningTime="2025-10-02 12:50:32.193066834 +0000 UTC m=+6032.743433198" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.545993 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.703940 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-combined-ca-bundle\") pod \"62ae9256-6d03-4694-b34f-246bff68e4f7\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.704205 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-scripts\") pod \"62ae9256-6d03-4694-b34f-246bff68e4f7\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.704223 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data\") pod \"62ae9256-6d03-4694-b34f-246bff68e4f7\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.704244 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data-merged\") pod \"62ae9256-6d03-4694-b34f-246bff68e4f7\" (UID: \"62ae9256-6d03-4694-b34f-246bff68e4f7\") " Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.711226 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data" (OuterVolumeSpecName: "config-data") pod "62ae9256-6d03-4694-b34f-246bff68e4f7" (UID: "62ae9256-6d03-4694-b34f-246bff68e4f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.711772 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-scripts" (OuterVolumeSpecName: "scripts") pod "62ae9256-6d03-4694-b34f-246bff68e4f7" (UID: "62ae9256-6d03-4694-b34f-246bff68e4f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.731584 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "62ae9256-6d03-4694-b34f-246bff68e4f7" (UID: "62ae9256-6d03-4694-b34f-246bff68e4f7"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.737633 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62ae9256-6d03-4694-b34f-246bff68e4f7" (UID: "62ae9256-6d03-4694-b34f-246bff68e4f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.806746 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.807166 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.807180 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ae9256-6d03-4694-b34f-246bff68e4f7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:50:32 crc kubenswrapper[4929]: I1002 12:50:32.807192 4929 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/62ae9256-6d03-4694-b34f-246bff68e4f7-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 02 12:50:33 crc kubenswrapper[4929]: I1002 12:50:33.162910 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-crswf" event={"ID":"62ae9256-6d03-4694-b34f-246bff68e4f7","Type":"ContainerDied","Data":"41c922bb7972895ae088fcec52c7116da6862cdf706933ac08984c566d8423ee"} Oct 02 12:50:33 crc kubenswrapper[4929]: I1002 12:50:33.162993 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41c922bb7972895ae088fcec52c7116da6862cdf706933ac08984c566d8423ee" Oct 02 12:50:33 crc kubenswrapper[4929]: I1002 12:50:33.163106 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-crswf" Oct 02 12:50:33 crc kubenswrapper[4929]: I1002 12:50:33.721524 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-2nxzx" Oct 02 12:50:38 crc kubenswrapper[4929]: I1002 12:50:38.039017 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-knkjw"] Oct 02 12:50:38 crc kubenswrapper[4929]: I1002 12:50:38.048468 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-knkjw"] Oct 02 12:50:38 crc kubenswrapper[4929]: I1002 12:50:38.169736 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12e4480-5f68-430b-af5a-e3c955a3b006" path="/var/lib/kubelet/pods/b12e4480-5f68-430b-af5a-e3c955a3b006/volumes" Oct 02 12:50:39 crc kubenswrapper[4929]: I1002 12:50:39.867689 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:50:39 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:50:39 crc kubenswrapper[4929]: > Oct 02 12:50:42 crc kubenswrapper[4929]: I1002 12:50:42.157620 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:50:42 crc kubenswrapper[4929]: E1002 12:50:42.158260 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:50:49 crc kubenswrapper[4929]: I1002 12:50:49.843310 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:50:49 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:50:49 crc kubenswrapper[4929]: > Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.156438 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:50:54 crc kubenswrapper[4929]: E1002 12:50:54.157196 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.336728 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v77r2"] Oct 02 12:50:54 crc kubenswrapper[4929]: E1002 12:50:54.337197 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ae9256-6d03-4694-b34f-246bff68e4f7" containerName="octavia-db-sync" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.337213 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ae9256-6d03-4694-b34f-246bff68e4f7" containerName="octavia-db-sync" Oct 02 
12:50:54 crc kubenswrapper[4929]: E1002 12:50:54.337256 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ae9256-6d03-4694-b34f-246bff68e4f7" containerName="init" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.337262 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ae9256-6d03-4694-b34f-246bff68e4f7" containerName="init" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.337444 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ae9256-6d03-4694-b34f-246bff68e4f7" containerName="octavia-db-sync" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.338825 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.345482 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v77r2"] Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.427732 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-catalog-content\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.427878 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-utilities\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.427934 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttb6l\" (UniqueName: \"kubernetes.io/projected/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-kube-api-access-ttb6l\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.529587 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-utilities\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.529636 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttb6l\" (UniqueName: \"kubernetes.io/projected/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-kube-api-access-ttb6l\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.529826 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-catalog-content\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.530161 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-utilities\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.530252 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-catalog-content\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.551048 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttb6l\" (UniqueName: \"kubernetes.io/projected/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-kube-api-access-ttb6l\") pod \"certified-operators-v77r2\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:54 crc kubenswrapper[4929]: I1002 12:50:54.658119 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:50:55 crc kubenswrapper[4929]: I1002 12:50:55.203927 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v77r2"] Oct 02 12:50:55 crc kubenswrapper[4929]: I1002 12:50:55.383181 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v77r2" event={"ID":"12bb8929-b790-4096-80b3-cb3a0c5ed2b4","Type":"ContainerStarted","Data":"3d0c555d5aecb29f3f69730210288a13233d0ec14c75de0dafd57d859cefffbd"} Oct 02 12:50:56 crc kubenswrapper[4929]: I1002 12:50:56.392894 4929 generic.go:334] "Generic (PLEG): container finished" podID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerID="c2b91ec9bc2d822c35bc3d6f6f09d327c9ca3b27449d1ebecadf4990701c82e2" exitCode=0 Oct 02 12:50:56 crc kubenswrapper[4929]: I1002 12:50:56.392967 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v77r2" event={"ID":"12bb8929-b790-4096-80b3-cb3a0c5ed2b4","Type":"ContainerDied","Data":"c2b91ec9bc2d822c35bc3d6f6f09d327c9ca3b27449d1ebecadf4990701c82e2"} Oct 02 12:50:59 crc kubenswrapper[4929]: I1002 12:50:59.424497 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v77r2" event={"ID":"12bb8929-b790-4096-80b3-cb3a0c5ed2b4","Type":"ContainerStarted","Data":"9ea14558c3f165b917cace17adb5236440682c31d1897dac8af58a48dc483b51"} Oct 02 12:50:59 crc kubenswrapper[4929]: I1002 12:50:59.845569 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:50:59 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:50:59 crc kubenswrapper[4929]: > Oct 02 12:51:05 crc kubenswrapper[4929]: I1002 12:51:05.156562 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:51:05 crc kubenswrapper[4929]: E1002 12:51:05.157266 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:51:06 crc kubenswrapper[4929]: I1002 12:51:06.497845 4929 generic.go:334] "Generic (PLEG): container finished" podID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerID="9ea14558c3f165b917cace17adb5236440682c31d1897dac8af58a48dc483b51" exitCode=0 Oct 02 12:51:06 crc kubenswrapper[4929]: I1002 12:51:06.497998 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v77r2" event={"ID":"12bb8929-b790-4096-80b3-cb3a0c5ed2b4","Type":"ContainerDied","Data":"9ea14558c3f165b917cace17adb5236440682c31d1897dac8af58a48dc483b51"} Oct 02 12:51:07 crc kubenswrapper[4929]: I1002 12:51:07.903593 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-fn8m4"] Oct 02 12:51:07 crc kubenswrapper[4929]: I1002 12:51:07.904135 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" podUID="01f35f8a-714e-402d-a510-da81db3b019a" containerName="octavia-amphora-httpd" containerID="cri-o://34f0c1874560053742914f10192a257dfdf966668c8ce5ab8bd053ff2904d687" gracePeriod=30 Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.521639 4929 generic.go:334] "Generic (PLEG): container finished" podID="01f35f8a-714e-402d-a510-da81db3b019a" containerID="34f0c1874560053742914f10192a257dfdf966668c8ce5ab8bd053ff2904d687" exitCode=0 Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.521749 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" event={"ID":"01f35f8a-714e-402d-a510-da81db3b019a","Type":"ContainerDied","Data":"34f0c1874560053742914f10192a257dfdf966668c8ce5ab8bd053ff2904d687"} Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.530901 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v77r2" event={"ID":"12bb8929-b790-4096-80b3-cb3a0c5ed2b4","Type":"ContainerStarted","Data":"e90fc00d0ae3fdaf3e25881cae8481d583dc7b6dc19611d4dc4073633ea180ef"} Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.572626 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v77r2" podStartSLOduration=3.510584235 podStartE2EDuration="14.572592908s" podCreationTimestamp="2025-10-02 12:50:54 +0000 UTC" firstStartedPulling="2025-10-02 12:50:56.39517027 +0000 UTC m=+6056.945536644" lastFinishedPulling="2025-10-02 12:51:07.457178953 +0000 UTC m=+6068.007545317" observedRunningTime="2025-10-02 12:51:08.549078922 +0000 UTC m=+6069.099445296" watchObservedRunningTime="2025-10-02 12:51:08.572592908 +0000 UTC m=+6069.122959272" Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.654244 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.713142 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/01f35f8a-714e-402d-a510-da81db3b019a-amphora-image\") pod \"01f35f8a-714e-402d-a510-da81db3b019a\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.713646 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01f35f8a-714e-402d-a510-da81db3b019a-httpd-config\") pod \"01f35f8a-714e-402d-a510-da81db3b019a\" (UID: \"01f35f8a-714e-402d-a510-da81db3b019a\") " Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.759835 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f35f8a-714e-402d-a510-da81db3b019a-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "01f35f8a-714e-402d-a510-da81db3b019a" (UID: "01f35f8a-714e-402d-a510-da81db3b019a"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.765056 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f35f8a-714e-402d-a510-da81db3b019a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "01f35f8a-714e-402d-a510-da81db3b019a" (UID: "01f35f8a-714e-402d-a510-da81db3b019a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.816681 4929 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/01f35f8a-714e-402d-a510-da81db3b019a-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:08 crc kubenswrapper[4929]: I1002 12:51:08.816723 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/01f35f8a-714e-402d-a510-da81db3b019a-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.039035 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-v687k"] Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.051918 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-v687k"] Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.543182 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" event={"ID":"01f35f8a-714e-402d-a510-da81db3b019a","Type":"ContainerDied","Data":"c30a970508c080b50d35da555aa9fb31acae48208cecf19e02bb47d0144b8c13"} Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.543256 4929 scope.go:117] "RemoveContainer" containerID="34f0c1874560053742914f10192a257dfdf966668c8ce5ab8bd053ff2904d687" Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.543620 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-fn8m4" Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.571700 4929 scope.go:117] "RemoveContainer" containerID="067154614f6a8595be66605f97c55aa864e94e53225f601dc7d04a9ea36195ac" Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.579431 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-fn8m4"] Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.591716 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-fn8m4"] Oct 02 12:51:09 crc kubenswrapper[4929]: I1002 12:51:09.857499 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:51:09 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:51:09 crc kubenswrapper[4929]: > Oct 02 12:51:10 crc kubenswrapper[4929]: I1002 12:51:10.169617 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f35f8a-714e-402d-a510-da81db3b019a" path="/var/lib/kubelet/pods/01f35f8a-714e-402d-a510-da81db3b019a/volumes" Oct 02 12:51:10 crc kubenswrapper[4929]: I1002 12:51:10.170270 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2a7fe3-11b1-412d-a810-e24dbf0d7656" path="/var/lib/kubelet/pods/bf2a7fe3-11b1-412d-a810-e24dbf0d7656/volumes" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.538186 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-7xfvd"] Oct 02 12:51:11 crc kubenswrapper[4929]: E1002 12:51:11.538929 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f35f8a-714e-402d-a510-da81db3b019a" containerName="octavia-amphora-httpd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.538944 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f35f8a-714e-402d-a510-da81db3b019a" containerName="octavia-amphora-httpd" Oct 02 12:51:11 crc kubenswrapper[4929]: E1002 12:51:11.538985 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f35f8a-714e-402d-a510-da81db3b019a" containerName="init" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.538994 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f35f8a-714e-402d-a510-da81db3b019a" containerName="init" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.539177 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f35f8a-714e-402d-a510-da81db3b019a" containerName="octavia-amphora-httpd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.540508 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.543012 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.543646 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.545396 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.550879 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-7xfvd"] Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.570200 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-scripts\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.570411 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab541ae0-0e64-4fc9-9f78-cb330e890126-config-data-merged\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.570467 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-amphora-certs\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.570528 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-config-data\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.570568 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-combined-ca-bundle\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.570598 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ab541ae0-0e64-4fc9-9f78-cb330e890126-hm-ports\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.672913 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ab541ae0-0e64-4fc9-9f78-cb330e890126-hm-ports\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc 
kubenswrapper[4929]: I1002 12:51:11.673012 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-scripts\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.673176 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab541ae0-0e64-4fc9-9f78-cb330e890126-config-data-merged\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.673221 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-amphora-certs\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.673257 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-config-data\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.673285 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-combined-ca-bundle\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.674313 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab541ae0-0e64-4fc9-9f78-cb330e890126-config-data-merged\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.675562 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ab541ae0-0e64-4fc9-9f78-cb330e890126-hm-ports\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.679450 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-combined-ca-bundle\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.680809 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-scripts\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.681266 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-config-data\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.694600 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ab541ae0-0e64-4fc9-9f78-cb330e890126-amphora-certs\") pod \"octavia-healthmanager-7xfvd\" (UID: \"ab541ae0-0e64-4fc9-9f78-cb330e890126\") " pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:11 crc kubenswrapper[4929]: I1002 12:51:11.860655 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:12 crc kubenswrapper[4929]: I1002 12:51:12.409489 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-7xfvd"] Oct 02 12:51:12 crc kubenswrapper[4929]: I1002 12:51:12.585083 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-7xfvd" event={"ID":"ab541ae0-0e64-4fc9-9f78-cb330e890126","Type":"ContainerStarted","Data":"889d1ff155de039bf797f4253dc1e8c9e7f391f9cc3676ff757677f890119adc"} Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.206549 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-7246t"] Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.208948 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.211635 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.216140 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/87fc42ed-8ef3-42be-8dc9-0d8e62925951-httpd-config\") pod \"octavia-image-upload-59f8cff499-7246t\" (UID: \"87fc42ed-8ef3-42be-8dc9-0d8e62925951\") " pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.216204 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/87fc42ed-8ef3-42be-8dc9-0d8e62925951-amphora-image\") pod \"octavia-image-upload-59f8cff499-7246t\" (UID: \"87fc42ed-8ef3-42be-8dc9-0d8e62925951\") " pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.219437 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-7246t"] Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.319147 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/87fc42ed-8ef3-42be-8dc9-0d8e62925951-httpd-config\") pod \"octavia-image-upload-59f8cff499-7246t\" (UID: \"87fc42ed-8ef3-42be-8dc9-0d8e62925951\") " pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.319209 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/87fc42ed-8ef3-42be-8dc9-0d8e62925951-amphora-image\") pod \"octavia-image-upload-59f8cff499-7246t\" (UID: \"87fc42ed-8ef3-42be-8dc9-0d8e62925951\") " 
pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.321496 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/87fc42ed-8ef3-42be-8dc9-0d8e62925951-amphora-image\") pod \"octavia-image-upload-59f8cff499-7246t\" (UID: \"87fc42ed-8ef3-42be-8dc9-0d8e62925951\") " pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.326141 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/87fc42ed-8ef3-42be-8dc9-0d8e62925951-httpd-config\") pod \"octavia-image-upload-59f8cff499-7246t\" (UID: \"87fc42ed-8ef3-42be-8dc9-0d8e62925951\") " pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.529935 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-7246t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.607796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-7xfvd" event={"ID":"ab541ae0-0e64-4fc9-9f78-cb330e890126","Type":"ContainerStarted","Data":"aebabbb0976c5c6197851b3e7a0db80fa7eb81c89544f37c4220ac05c78eed60"} Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.946068 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-qtk4t"] Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.948128 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.950781 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.951979 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 02 12:51:13 crc kubenswrapper[4929]: I1002 12:51:13.966623 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-qtk4t"] Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.013399 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-7246t"] Oct 02 12:51:14 crc kubenswrapper[4929]: W1002 12:51:14.025824 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87fc42ed_8ef3_42be_8dc9_0d8e62925951.slice/crio-928e0c9dfbdd8296d57edbfc490b5646e77264638fd8aff98dd3104c92ba4421 WatchSource:0}: Error finding container 928e0c9dfbdd8296d57edbfc490b5646e77264638fd8aff98dd3104c92ba4421: Status 404 returned error can't find the container with id 928e0c9dfbdd8296d57edbfc490b5646e77264638fd8aff98dd3104c92ba4421 Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.030576 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-scripts\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.030758 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-config-data\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.030901 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-combined-ca-bundle\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.031028 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-amphora-certs\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.031157 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfdcc6b1-932c-4139-bc67-155977759446-hm-ports\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.031247 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfdcc6b1-932c-4139-bc67-155977759446-config-data-merged\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.135541 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-scripts\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.135620 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-config-data\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.135693 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-combined-ca-bundle\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.135739 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-amphora-certs\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.135803 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: 
\"kubernetes.io/configmap/cfdcc6b1-932c-4139-bc67-155977759446-hm-ports\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.135831 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfdcc6b1-932c-4139-bc67-155977759446-config-data-merged\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.136323 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfdcc6b1-932c-4139-bc67-155977759446-config-data-merged\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.143644 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfdcc6b1-932c-4139-bc67-155977759446-hm-ports\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.144479 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-scripts\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.145340 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-config-data\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.147396 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-amphora-certs\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.149199 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdcc6b1-932c-4139-bc67-155977759446-combined-ca-bundle\") pod \"octavia-housekeeping-qtk4t\" (UID: \"cfdcc6b1-932c-4139-bc67-155977759446\") " pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.282439 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.620584 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-7246t" event={"ID":"87fc42ed-8ef3-42be-8dc9-0d8e62925951","Type":"ContainerStarted","Data":"928e0c9dfbdd8296d57edbfc490b5646e77264638fd8aff98dd3104c92ba4421"} Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.658739 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.658855 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.714097 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:51:14 crc kubenswrapper[4929]: I1002 12:51:14.888626 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-qtk4t"] Oct 02 12:51:15 crc kubenswrapper[4929]: I1002 12:51:15.631999 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-7246t" event={"ID":"87fc42ed-8ef3-42be-8dc9-0d8e62925951","Type":"ContainerStarted","Data":"34e880c93fc118db61a32ea6dc9b7d5d917e08c0156faeaf6f42bcc20112aeca"} Oct 02 12:51:15 crc kubenswrapper[4929]: I1002 12:51:15.633527 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qtk4t" event={"ID":"cfdcc6b1-932c-4139-bc67-155977759446","Type":"ContainerStarted","Data":"b4d5a2df6c69203b96df341d41a5fe78da29d00b9b700eb2072ef7d722b3e2e3"} Oct 02 12:51:15 crc kubenswrapper[4929]: I1002 12:51:15.635600 4929 generic.go:334] "Generic (PLEG): container finished" podID="ab541ae0-0e64-4fc9-9f78-cb330e890126" containerID="aebabbb0976c5c6197851b3e7a0db80fa7eb81c89544f37c4220ac05c78eed60" exitCode=0 Oct 02 12:51:15 crc kubenswrapper[4929]: I1002 12:51:15.635696 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-7xfvd" event={"ID":"ab541ae0-0e64-4fc9-9f78-cb330e890126","Type":"ContainerDied","Data":"aebabbb0976c5c6197851b3e7a0db80fa7eb81c89544f37c4220ac05c78eed60"} Oct 02 12:51:15 crc kubenswrapper[4929]: I1002 12:51:15.700339 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:51:15 crc kubenswrapper[4929]: I1002 12:51:15.755116 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v77r2"] Oct 02 12:51:16 crc kubenswrapper[4929]: I1002 12:51:16.663041 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-7xfvd" event={"ID":"ab541ae0-0e64-4fc9-9f78-cb330e890126","Type":"ContainerStarted","Data":"142b0fb1de5167059b59f96cc26feff37b7b150af4014d1a9f7a286e82ebd1f7"} Oct 02 12:51:16 crc kubenswrapper[4929]: I1002 12:51:16.664615 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:16 crc kubenswrapper[4929]: I1002 12:51:16.695823 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-7xfvd" podStartSLOduration=5.695794579 podStartE2EDuration="5.695794579s" podCreationTimestamp="2025-10-02 12:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:51:16.684670973 +0000 UTC m=+6077.235037337" watchObservedRunningTime="2025-10-02 12:51:16.695794579 +0000 UTC m=+6077.246160943" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.156578 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:51:17 crc kubenswrapper[4929]: E1002 12:51:17.156977 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.278590 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-8bpgx"] Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.280715 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.282475 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.284583 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.288581 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8bpgx"] Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.407971 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3580a8ac-ecc4-46d6-8e76-4c49e5341380-hm-ports\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.408119 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-scripts\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.408354 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-config-data\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.408410 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-combined-ca-bundle\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.408649 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-amphora-certs\") pod \"octavia-worker-8bpgx\" (UID: 
\"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.408826 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3580a8ac-ecc4-46d6-8e76-4c49e5341380-config-data-merged\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.510952 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3580a8ac-ecc4-46d6-8e76-4c49e5341380-hm-ports\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.511031 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-scripts\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.511084 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-config-data\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.511113 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-combined-ca-bundle\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.511166 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-amphora-certs\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.511216 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3580a8ac-ecc4-46d6-8e76-4c49e5341380-config-data-merged\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.511716 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3580a8ac-ecc4-46d6-8e76-4c49e5341380-config-data-merged\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.512136 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3580a8ac-ecc4-46d6-8e76-4c49e5341380-hm-ports\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.517719 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-combined-ca-bundle\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.518259 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-amphora-certs\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.518602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-config-data\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.521451 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3580a8ac-ecc4-46d6-8e76-4c49e5341380-scripts\") pod \"octavia-worker-8bpgx\" (UID: \"3580a8ac-ecc4-46d6-8e76-4c49e5341380\") " pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.612773 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:17 crc kubenswrapper[4929]: I1002 12:51:17.671224 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v77r2" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="registry-server" containerID="cri-o://e90fc00d0ae3fdaf3e25881cae8481d583dc7b6dc19611d4dc4073633ea180ef" gracePeriod=2 Oct 02 12:51:18 crc kubenswrapper[4929]: W1002 12:51:18.263879 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3580a8ac_ecc4_46d6_8e76_4c49e5341380.slice/crio-4ce71ea239be2526b4a5afa437f8e91052628a88ca5ba350b76a67c691aa60d8 WatchSource:0}: Error finding container 4ce71ea239be2526b4a5afa437f8e91052628a88ca5ba350b76a67c691aa60d8: Status 404 returned error can't find the container with id 4ce71ea239be2526b4a5afa437f8e91052628a88ca5ba350b76a67c691aa60d8 Oct 02 12:51:18 crc kubenswrapper[4929]: I1002 12:51:18.269092 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8bpgx"] Oct 02 12:51:18 crc kubenswrapper[4929]: I1002 12:51:18.682426 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8bpgx" event={"ID":"3580a8ac-ecc4-46d6-8e76-4c49e5341380","Type":"ContainerStarted","Data":"4ce71ea239be2526b4a5afa437f8e91052628a88ca5ba350b76a67c691aa60d8"} Oct 02 12:51:18 crc kubenswrapper[4929]: I1002 12:51:18.857475 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:51:18 crc kubenswrapper[4929]: I1002 12:51:18.911381 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:51:19 crc kubenswrapper[4929]: I1002 12:51:19.032937 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-89fe-account-create-xvbc2"] Oct 02 12:51:19 crc kubenswrapper[4929]: I1002 12:51:19.043470 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-89fe-account-create-xvbc2"] Oct 02 12:51:19 crc kubenswrapper[4929]: I1002 12:51:19.693976 4929 generic.go:334] "Generic (PLEG): container finished" podID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerID="e90fc00d0ae3fdaf3e25881cae8481d583dc7b6dc19611d4dc4073633ea180ef" exitCode=0 Oct 02 12:51:19 crc kubenswrapper[4929]: I1002 12:51:19.693998 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v77r2" event={"ID":"12bb8929-b790-4096-80b3-cb3a0c5ed2b4","Type":"ContainerDied","Data":"e90fc00d0ae3fdaf3e25881cae8481d583dc7b6dc19611d4dc4073633ea180ef"} Oct 02 12:51:19 crc kubenswrapper[4929]: I1002 12:51:19.725666 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bd8cj"] Oct 02 12:51:20 crc kubenswrapper[4929]: I1002 12:51:20.169327 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a6695bb-e510-4974-94c6-9f18ebe5af7e" path="/var/lib/kubelet/pods/6a6695bb-e510-4974-94c6-9f18ebe5af7e/volumes" Oct 02 12:51:20 crc kubenswrapper[4929]: I1002 12:51:20.715147 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bd8cj" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" containerID="cri-o://98b30c0fe9f4c189c93d5ab48a8b5b076bf4c208a23accb49516dc70a622aeb3" gracePeriod=2 Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.177231 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.288068 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-catalog-content\") pod \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.288349 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttb6l\" (UniqueName: \"kubernetes.io/projected/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-kube-api-access-ttb6l\") pod \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.288574 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-utilities\") pod \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\" (UID: \"12bb8929-b790-4096-80b3-cb3a0c5ed2b4\") " Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.289473 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-utilities" (OuterVolumeSpecName: "utilities") pod "12bb8929-b790-4096-80b3-cb3a0c5ed2b4" (UID: "12bb8929-b790-4096-80b3-cb3a0c5ed2b4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.290845 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.296264 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-kube-api-access-ttb6l" (OuterVolumeSpecName: "kube-api-access-ttb6l") pod "12bb8929-b790-4096-80b3-cb3a0c5ed2b4" (UID: "12bb8929-b790-4096-80b3-cb3a0c5ed2b4"). InnerVolumeSpecName "kube-api-access-ttb6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.322766 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12bb8929-b790-4096-80b3-cb3a0c5ed2b4" (UID: "12bb8929-b790-4096-80b3-cb3a0c5ed2b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.392720 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttb6l\" (UniqueName: \"kubernetes.io/projected/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-kube-api-access-ttb6l\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.392755 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12bb8929-b790-4096-80b3-cb3a0c5ed2b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.728451 4929 generic.go:334] "Generic (PLEG): container finished" podID="e257b306-16b1-446b-98cc-0fe390d1026b" containerID="98b30c0fe9f4c189c93d5ab48a8b5b076bf4c208a23accb49516dc70a622aeb3" exitCode=0 Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.728543 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bd8cj" event={"ID":"e257b306-16b1-446b-98cc-0fe390d1026b","Type":"ContainerDied","Data":"98b30c0fe9f4c189c93d5ab48a8b5b076bf4c208a23accb49516dc70a622aeb3"} Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.731908 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v77r2" event={"ID":"12bb8929-b790-4096-80b3-cb3a0c5ed2b4","Type":"ContainerDied","Data":"3d0c555d5aecb29f3f69730210288a13233d0ec14c75de0dafd57d859cefffbd"} Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.732008 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v77r2" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.732021 4929 scope.go:117] "RemoveContainer" containerID="e90fc00d0ae3fdaf3e25881cae8481d583dc7b6dc19611d4dc4073633ea180ef" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.754535 4929 scope.go:117] "RemoveContainer" containerID="9ea14558c3f165b917cace17adb5236440682c31d1897dac8af58a48dc483b51" Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.769524 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v77r2"] Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.777396 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v77r2"] Oct 02 12:51:21 crc kubenswrapper[4929]: I1002 12:51:21.804482 4929 scope.go:117] "RemoveContainer" containerID="c2b91ec9bc2d822c35bc3d6f6f09d327c9ca3b27449d1ebecadf4990701c82e2" Oct 02 12:51:22 crc kubenswrapper[4929]: I1002 12:51:22.172986 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" path="/var/lib/kubelet/pods/12bb8929-b790-4096-80b3-cb3a0c5ed2b4/volumes" Oct 02 12:51:22 crc kubenswrapper[4929]: I1002 12:51:22.946324 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.132013 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-utilities\") pod \"e257b306-16b1-446b-98cc-0fe390d1026b\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.132231 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-catalog-content\") pod \"e257b306-16b1-446b-98cc-0fe390d1026b\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.132499 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lgmk\" (UniqueName: \"kubernetes.io/projected/e257b306-16b1-446b-98cc-0fe390d1026b-kube-api-access-5lgmk\") pod \"e257b306-16b1-446b-98cc-0fe390d1026b\" (UID: \"e257b306-16b1-446b-98cc-0fe390d1026b\") " Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.132606 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-utilities" (OuterVolumeSpecName: "utilities") pod "e257b306-16b1-446b-98cc-0fe390d1026b" (UID: "e257b306-16b1-446b-98cc-0fe390d1026b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.133460 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.139559 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e257b306-16b1-446b-98cc-0fe390d1026b-kube-api-access-5lgmk" (OuterVolumeSpecName: "kube-api-access-5lgmk") pod "e257b306-16b1-446b-98cc-0fe390d1026b" (UID: "e257b306-16b1-446b-98cc-0fe390d1026b"). 
InnerVolumeSpecName "kube-api-access-5lgmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.210927 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e257b306-16b1-446b-98cc-0fe390d1026b" (UID: "e257b306-16b1-446b-98cc-0fe390d1026b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.235469 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e257b306-16b1-446b-98cc-0fe390d1026b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.235518 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lgmk\" (UniqueName: \"kubernetes.io/projected/e257b306-16b1-446b-98cc-0fe390d1026b-kube-api-access-5lgmk\") on node \"crc\" DevicePath \"\"" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.510337 4929 scope.go:117] "RemoveContainer" containerID="597660d037f8f36df21b9bbbad970f76560cda6b5dc965a6f5dce3503af2d3b5" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.664116 4929 scope.go:117] "RemoveContainer" containerID="6876c249d2f639ad5f545507923c0f54fae7595929e6b936beb224aa6b535ec4" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.765033 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bd8cj" event={"ID":"e257b306-16b1-446b-98cc-0fe390d1026b","Type":"ContainerDied","Data":"9ff287d3a77379cce59adfdea1b4434a74e90b20ad1187415a5ae1b1b2cbff6f"} Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.765114 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bd8cj" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.765122 4929 scope.go:117] "RemoveContainer" containerID="98b30c0fe9f4c189c93d5ab48a8b5b076bf4c208a23accb49516dc70a622aeb3" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.789139 4929 scope.go:117] "RemoveContainer" containerID="6d7fb57df0b2ded2b8c3068839405a760768eb97113eda8bd4017ceae45306c1" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.811765 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bd8cj"] Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.819021 4929 scope.go:117] "RemoveContainer" containerID="e9245e21d092137e123999929759af0d9825fa27c31f6f5ebce77055d9119fe4" Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.822154 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bd8cj"] Oct 02 12:51:23 crc kubenswrapper[4929]: I1002 12:51:23.884777 4929 scope.go:117] "RemoveContainer" containerID="3cece3341682b00d935b88b232a24bc769ee27afc0d2cef4c55f91b5984e377a" Oct 02 12:51:24 crc kubenswrapper[4929]: I1002 12:51:24.068352 4929 scope.go:117] "RemoveContainer" containerID="82f6e7c8b0d69574acd2c6a4ec1bc18974ac21b1ad2e04f266da0dc5a1841379" Oct 02 12:51:24 crc kubenswrapper[4929]: I1002 12:51:24.170487 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" path="/var/lib/kubelet/pods/e257b306-16b1-446b-98cc-0fe390d1026b/volumes" Oct 02 12:51:26 crc kubenswrapper[4929]: I1002 12:51:26.888800 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-7xfvd" Oct 02 12:51:27 crc kubenswrapper[4929]: I1002 12:51:27.041342 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-k7z8p"] Oct 02 12:51:27 crc kubenswrapper[4929]: I1002 12:51:27.050289 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-k7z8p"] Oct 02 12:51:28 crc kubenswrapper[4929]: I1002 12:51:28.157603 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:51:28 crc kubenswrapper[4929]: E1002 12:51:28.158035 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:51:28 crc kubenswrapper[4929]: I1002 12:51:28.168941 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8" path="/var/lib/kubelet/pods/01c87b1d-4dc5-47dc-90fb-61ed0e3c57c8/volumes" Oct 02 12:51:28 crc kubenswrapper[4929]: I1002 12:51:28.818657 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qtk4t" event={"ID":"cfdcc6b1-932c-4139-bc67-155977759446","Type":"ContainerStarted","Data":"5de2be5d7185e265ecde2e94d46e1c54a8ec7c11707fbda204981b16e98d299a"} Oct 02 12:51:29 crc kubenswrapper[4929]: I1002 12:51:29.830293 4929 generic.go:334] "Generic (PLEG): container finished" podID="87fc42ed-8ef3-42be-8dc9-0d8e62925951" containerID="34e880c93fc118db61a32ea6dc9b7d5d917e08c0156faeaf6f42bcc20112aeca" exitCode=0 Oct 02 
12:51:29 crc kubenswrapper[4929]: I1002 12:51:29.830391 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-7246t" event={"ID":"87fc42ed-8ef3-42be-8dc9-0d8e62925951","Type":"ContainerDied","Data":"34e880c93fc118db61a32ea6dc9b7d5d917e08c0156faeaf6f42bcc20112aeca"} Oct 02 12:51:29 crc kubenswrapper[4929]: I1002 12:51:29.832667 4929 generic.go:334] "Generic (PLEG): container finished" podID="cfdcc6b1-932c-4139-bc67-155977759446" containerID="5de2be5d7185e265ecde2e94d46e1c54a8ec7c11707fbda204981b16e98d299a" exitCode=0 Oct 02 12:51:29 crc kubenswrapper[4929]: I1002 12:51:29.832702 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qtk4t" event={"ID":"cfdcc6b1-932c-4139-bc67-155977759446","Type":"ContainerDied","Data":"5de2be5d7185e265ecde2e94d46e1c54a8ec7c11707fbda204981b16e98d299a"} Oct 02 12:51:32 crc kubenswrapper[4929]: I1002 12:51:32.865501 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-qtk4t" event={"ID":"cfdcc6b1-932c-4139-bc67-155977759446","Type":"ContainerStarted","Data":"5bb771a339fd5bdf692004551618d399de12b869a7b5c04b33b33688f3c580a9"} Oct 02 12:51:32 crc kubenswrapper[4929]: I1002 12:51:32.867165 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:32 crc kubenswrapper[4929]: I1002 12:51:32.894557 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-qtk4t" podStartSLOduration=7.426380376 podStartE2EDuration="19.89453526s" podCreationTimestamp="2025-10-02 12:51:13 +0000 UTC" firstStartedPulling="2025-10-02 12:51:14.906123099 +0000 UTC m=+6075.456489473" lastFinishedPulling="2025-10-02 12:51:27.374277993 +0000 UTC m=+6087.924644357" observedRunningTime="2025-10-02 12:51:32.890890317 +0000 UTC m=+6093.441256681" watchObservedRunningTime="2025-10-02 12:51:32.89453526 +0000 UTC m=+6093.444901624" Oct 02 12:51:33 crc kubenswrapper[4929]: I1002 12:51:33.878231 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8bpgx" event={"ID":"3580a8ac-ecc4-46d6-8e76-4c49e5341380","Type":"ContainerStarted","Data":"aa1baaea1743636b92a7495a039d8519ddf34454e78016287a9725f04bb8db79"} Oct 02 12:51:34 crc kubenswrapper[4929]: E1002 12:51:34.084977 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3580a8ac_ecc4_46d6_8e76_4c49e5341380.slice/crio-aa1baaea1743636b92a7495a039d8519ddf34454e78016287a9725f04bb8db79.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:51:34 crc kubenswrapper[4929]: I1002 12:51:34.890322 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-7246t" event={"ID":"87fc42ed-8ef3-42be-8dc9-0d8e62925951","Type":"ContainerStarted","Data":"c4f542ca897e79426263cc5c400bf62525914ae05578924be1ead3ee8dc73fb1"} Oct 02 12:51:34 crc kubenswrapper[4929]: I1002 12:51:34.893034 4929 generic.go:334] "Generic (PLEG): container finished" podID="3580a8ac-ecc4-46d6-8e76-4c49e5341380" containerID="aa1baaea1743636b92a7495a039d8519ddf34454e78016287a9725f04bb8db79" exitCode=0 Oct 02 12:51:34 crc kubenswrapper[4929]: I1002 12:51:34.893085 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8bpgx" 
event={"ID":"3580a8ac-ecc4-46d6-8e76-4c49e5341380","Type":"ContainerDied","Data":"aa1baaea1743636b92a7495a039d8519ddf34454e78016287a9725f04bb8db79"} Oct 02 12:51:34 crc kubenswrapper[4929]: I1002 12:51:34.924096 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-7246t" podStartSLOduration=1.520098939 podStartE2EDuration="21.924030455s" podCreationTimestamp="2025-10-02 12:51:13 +0000 UTC" firstStartedPulling="2025-10-02 12:51:14.034598854 +0000 UTC m=+6074.584965218" lastFinishedPulling="2025-10-02 12:51:34.43853037 +0000 UTC m=+6094.988896734" observedRunningTime="2025-10-02 12:51:34.905324165 +0000 UTC m=+6095.455690529" watchObservedRunningTime="2025-10-02 12:51:34.924030455 +0000 UTC m=+6095.474396819" Oct 02 12:51:35 crc kubenswrapper[4929]: I1002 12:51:35.905879 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8bpgx" event={"ID":"3580a8ac-ecc4-46d6-8e76-4c49e5341380","Type":"ContainerStarted","Data":"165fbbfc9d48678febe2e9e994fbc0272ac786e7ce1a0a3ca2b9eedea9ab5198"} Oct 02 12:51:35 crc kubenswrapper[4929]: I1002 12:51:35.906399 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:35 crc kubenswrapper[4929]: I1002 12:51:35.967981 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-8bpgx" podStartSLOduration=4.813255923 podStartE2EDuration="18.967939225s" podCreationTimestamp="2025-10-02 12:51:17 +0000 UTC" firstStartedPulling="2025-10-02 12:51:18.267197075 +0000 UTC m=+6078.817563439" lastFinishedPulling="2025-10-02 12:51:32.421880377 +0000 UTC m=+6092.972246741" observedRunningTime="2025-10-02 12:51:35.963284183 +0000 UTC m=+6096.513650557" watchObservedRunningTime="2025-10-02 12:51:35.967939225 +0000 UTC m=+6096.518305589" Oct 02 12:51:39 crc kubenswrapper[4929]: I1002 12:51:39.157380 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:51:39 crc kubenswrapper[4929]: E1002 12:51:39.158085 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:51:44 crc kubenswrapper[4929]: I1002 12:51:44.314433 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-qtk4t" Oct 02 12:51:47 crc kubenswrapper[4929]: I1002 12:51:47.643881 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-8bpgx" Oct 02 12:51:51 crc kubenswrapper[4929]: I1002 12:51:51.156542 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:51:51 crc kubenswrapper[4929]: E1002 12:51:51.157259 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.464236 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wsp5t"] Oct 02 12:51:52 crc kubenswrapper[4929]: E1002 12:51:52.464843 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="extract-utilities" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.464855 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="extract-utilities" Oct 02 12:51:52 crc kubenswrapper[4929]: E1002 12:51:52.464870 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="extract-content" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.464876 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="extract-content" Oct 02 12:51:52 crc kubenswrapper[4929]: E1002 12:51:52.464889 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="extract-utilities" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.464895 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="extract-utilities" Oct 02 12:51:52 crc kubenswrapper[4929]: E1002 12:51:52.464912 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="registry-server" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.464918 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="registry-server" Oct 02 12:51:52 crc kubenswrapper[4929]: E1002 12:51:52.464939 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="extract-content" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.464945 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="extract-content" Oct 02 12:51:52 crc kubenswrapper[4929]: E1002 12:51:52.464974 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.464980 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.465153 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e257b306-16b1-446b-98cc-0fe390d1026b" containerName="registry-server" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.465166 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bb8929-b790-4096-80b3-cb3a0c5ed2b4" containerName="registry-server" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.466512 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.479981 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wsp5t"] Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.535534 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-utilities\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.535637 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tc6\" (UniqueName: \"kubernetes.io/projected/f6ad62de-992d-48ae-a960-1c411ada3421-kube-api-access-l9tc6\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.535791 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-catalog-content\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.637208 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tc6\" (UniqueName: \"kubernetes.io/projected/f6ad62de-992d-48ae-a960-1c411ada3421-kube-api-access-l9tc6\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.637422 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-catalog-content\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.637497 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-utilities\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.637926 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-catalog-content\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.638011 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-utilities\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.665236 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l9tc6\" (UniqueName: \"kubernetes.io/projected/f6ad62de-992d-48ae-a960-1c411ada3421-kube-api-access-l9tc6\") pod \"community-operators-wsp5t\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:52 crc kubenswrapper[4929]: I1002 12:51:52.810175 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:51:53 crc kubenswrapper[4929]: I1002 12:51:53.354251 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wsp5t"] Oct 02 12:51:54 crc kubenswrapper[4929]: I1002 12:51:54.109353 4929 generic.go:334] "Generic (PLEG): container finished" podID="f6ad62de-992d-48ae-a960-1c411ada3421" containerID="406e274bb44f3773aa7084033bed47a73a8f7e382cb74635b9a8833aaf2e9798" exitCode=0 Oct 02 12:51:54 crc kubenswrapper[4929]: I1002 12:51:54.109412 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsp5t" event={"ID":"f6ad62de-992d-48ae-a960-1c411ada3421","Type":"ContainerDied","Data":"406e274bb44f3773aa7084033bed47a73a8f7e382cb74635b9a8833aaf2e9798"} Oct 02 12:51:54 crc kubenswrapper[4929]: I1002 12:51:54.109686 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsp5t" event={"ID":"f6ad62de-992d-48ae-a960-1c411ada3421","Type":"ContainerStarted","Data":"95299355a023491313680cdc9e422d394472cb04cfac9ecad74e68a66cc68caf"} Oct 02 12:51:54 crc kubenswrapper[4929]: I1002 12:51:54.111862 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:51:56 crc kubenswrapper[4929]: I1002 12:51:56.140391 4929 generic.go:334] "Generic (PLEG): container finished" podID="f6ad62de-992d-48ae-a960-1c411ada3421" containerID="6639cbe2e5fc62dccad474c00066a59e1d34fe4e5f6403cb141284844c8a2005" exitCode=0 Oct 02 12:51:56 crc kubenswrapper[4929]: I1002 12:51:56.140453 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsp5t" event={"ID":"f6ad62de-992d-48ae-a960-1c411ada3421","Type":"ContainerDied","Data":"6639cbe2e5fc62dccad474c00066a59e1d34fe4e5f6403cb141284844c8a2005"} Oct 02 12:51:58 crc kubenswrapper[4929]: I1002 12:51:58.169202 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsp5t" event={"ID":"f6ad62de-992d-48ae-a960-1c411ada3421","Type":"ContainerStarted","Data":"677056ba74b981ee26a6d0a8bc43cf48748f38eb2b777621142e7575b9d0b728"} Oct 02 12:51:58 crc kubenswrapper[4929]: I1002 12:51:58.191655 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wsp5t" podStartSLOduration=3.100809364 podStartE2EDuration="6.191635512s" podCreationTimestamp="2025-10-02 12:51:52 +0000 UTC" firstStartedPulling="2025-10-02 12:51:54.111468442 +0000 UTC m=+6114.661834806" lastFinishedPulling="2025-10-02 12:51:57.20229459 +0000 UTC m=+6117.752660954" observedRunningTime="2025-10-02 12:51:58.181620188 +0000 UTC m=+6118.731986572" watchObservedRunningTime="2025-10-02 12:51:58.191635512 +0000 UTC m=+6118.742001876" Oct 02 12:52:02 crc kubenswrapper[4929]: I1002 12:52:02.810802 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:52:02 crc kubenswrapper[4929]: I1002 12:52:02.811113 4929 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:52:02 crc kubenswrapper[4929]: I1002 12:52:02.864525 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:52:03 crc kubenswrapper[4929]: I1002 12:52:03.250641 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:52:03 crc kubenswrapper[4929]: I1002 12:52:03.307727 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wsp5t"] Oct 02 12:52:05 crc kubenswrapper[4929]: I1002 12:52:05.248301 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wsp5t" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="registry-server" containerID="cri-o://677056ba74b981ee26a6d0a8bc43cf48748f38eb2b777621142e7575b9d0b728" gracePeriod=2 Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.159155 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:52:06 crc kubenswrapper[4929]: E1002 12:52:06.159725 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.260804 4929 generic.go:334] "Generic (PLEG): container finished" podID="f6ad62de-992d-48ae-a960-1c411ada3421" containerID="677056ba74b981ee26a6d0a8bc43cf48748f38eb2b777621142e7575b9d0b728" exitCode=0 Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.260843 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsp5t" event={"ID":"f6ad62de-992d-48ae-a960-1c411ada3421","Type":"ContainerDied","Data":"677056ba74b981ee26a6d0a8bc43cf48748f38eb2b777621142e7575b9d0b728"} Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.401267 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.504940 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-catalog-content\") pod \"f6ad62de-992d-48ae-a960-1c411ada3421\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.505156 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9tc6\" (UniqueName: \"kubernetes.io/projected/f6ad62de-992d-48ae-a960-1c411ada3421-kube-api-access-l9tc6\") pod \"f6ad62de-992d-48ae-a960-1c411ada3421\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.505266 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-utilities\") pod \"f6ad62de-992d-48ae-a960-1c411ada3421\" (UID: \"f6ad62de-992d-48ae-a960-1c411ada3421\") " Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.506398 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-utilities" (OuterVolumeSpecName: "utilities") pod "f6ad62de-992d-48ae-a960-1c411ada3421" (UID: "f6ad62de-992d-48ae-a960-1c411ada3421"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.519424 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ad62de-992d-48ae-a960-1c411ada3421-kube-api-access-l9tc6" (OuterVolumeSpecName: "kube-api-access-l9tc6") pod "f6ad62de-992d-48ae-a960-1c411ada3421" (UID: "f6ad62de-992d-48ae-a960-1c411ada3421"). InnerVolumeSpecName "kube-api-access-l9tc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.568779 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6ad62de-992d-48ae-a960-1c411ada3421" (UID: "f6ad62de-992d-48ae-a960-1c411ada3421"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.608149 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9tc6\" (UniqueName: \"kubernetes.io/projected/f6ad62de-992d-48ae-a960-1c411ada3421-kube-api-access-l9tc6\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.608450 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:06 crc kubenswrapper[4929]: I1002 12:52:06.608590 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6ad62de-992d-48ae-a960-1c411ada3421-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:07 crc kubenswrapper[4929]: I1002 12:52:07.274397 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wsp5t" event={"ID":"f6ad62de-992d-48ae-a960-1c411ada3421","Type":"ContainerDied","Data":"95299355a023491313680cdc9e422d394472cb04cfac9ecad74e68a66cc68caf"} Oct 02 12:52:07 crc kubenswrapper[4929]: I1002 12:52:07.274697 4929 scope.go:117] "RemoveContainer" containerID="677056ba74b981ee26a6d0a8bc43cf48748f38eb2b777621142e7575b9d0b728" Oct 02 12:52:07 crc kubenswrapper[4929]: I1002 12:52:07.274755 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wsp5t" Oct 02 12:52:07 crc kubenswrapper[4929]: I1002 12:52:07.303471 4929 scope.go:117] "RemoveContainer" containerID="6639cbe2e5fc62dccad474c00066a59e1d34fe4e5f6403cb141284844c8a2005" Oct 02 12:52:07 crc kubenswrapper[4929]: I1002 12:52:07.332185 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wsp5t"] Oct 02 12:52:07 crc kubenswrapper[4929]: I1002 12:52:07.342511 4929 scope.go:117] "RemoveContainer" containerID="406e274bb44f3773aa7084033bed47a73a8f7e382cb74635b9a8833aaf2e9798" Oct 02 12:52:07 crc kubenswrapper[4929]: I1002 12:52:07.342969 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wsp5t"] Oct 02 12:52:08 crc kubenswrapper[4929]: I1002 12:52:08.169204 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" path="/var/lib/kubelet/pods/f6ad62de-992d-48ae-a960-1c411ada3421/volumes" Oct 02 12:52:09 crc kubenswrapper[4929]: I1002 12:52:09.046604 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qps82"] Oct 02 12:52:09 crc kubenswrapper[4929]: I1002 12:52:09.057087 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qps82"] Oct 02 12:52:10 crc kubenswrapper[4929]: I1002 12:52:10.168307 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee" path="/var/lib/kubelet/pods/11d5a0ec-7b70-400f-94d8-bfa6e6b6ecee/volumes" Oct 02 12:52:19 crc kubenswrapper[4929]: I1002 12:52:19.036496 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2da6-account-create-4nhvz"] Oct 02 12:52:19 crc kubenswrapper[4929]: I1002 12:52:19.045712 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2da6-account-create-4nhvz"] Oct 02 12:52:20 crc kubenswrapper[4929]: I1002 12:52:20.171244 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d9d4049-dac2-4fa5-abc5-e048122a0672" path="/var/lib/kubelet/pods/1d9d4049-dac2-4fa5-abc5-e048122a0672/volumes" Oct 02 12:52:21 crc kubenswrapper[4929]: I1002 12:52:21.157184 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:52:21 crc kubenswrapper[4929]: E1002 12:52:21.157625 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:52:24 crc kubenswrapper[4929]: I1002 12:52:24.690073 4929 scope.go:117] "RemoveContainer" containerID="d31ae9d8ee9133aabd44db7e50e650ec7046111eff6296e8818f43699f8db055" Oct 02 12:52:24 crc kubenswrapper[4929]: I1002 12:52:24.714239 4929 scope.go:117] "RemoveContainer" containerID="1ca05a5496be8ad4a20e3f2664248a7507a36abb73e0f9db27329ed932ff256b" Oct 02 12:52:24 crc kubenswrapper[4929]: I1002 12:52:24.798101 4929 scope.go:117] "RemoveContainer" containerID="a29e5e5b7fea7c8049995bba6b38042f5e1721a48dccaee00a451c03da7c1c00" Oct 02 12:52:28 crc kubenswrapper[4929]: I1002 12:52:28.038563 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9plpw"] Oct 02 12:52:28 crc kubenswrapper[4929]: I1002 12:52:28.046301 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9plpw"] Oct 02 12:52:28 crc kubenswrapper[4929]: I1002 12:52:28.168665 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28075288-21df-4796-ae55-feb4d7f91163" path="/var/lib/kubelet/pods/28075288-21df-4796-ae55-feb4d7f91163/volumes" Oct 02 12:52:33 crc kubenswrapper[4929]: I1002 12:52:33.157373 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:52:33 crc kubenswrapper[4929]: E1002 12:52:33.158166 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.236426 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8dd478767-vd5p6"] Oct 02 12:52:45 crc kubenswrapper[4929]: E1002 12:52:45.237263 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="extract-utilities" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.237276 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="extract-utilities" Oct 02 12:52:45 crc kubenswrapper[4929]: E1002 12:52:45.237306 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="extract-content" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.237312 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="extract-content" Oct 02 12:52:45 crc 
kubenswrapper[4929]: E1002 12:52:45.237329 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="registry-server" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.237335 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="registry-server" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.237524 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ad62de-992d-48ae-a960-1c411ada3421" containerName="registry-server" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.238518 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.240578 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.240837 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-sbdw6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.240907 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.241483 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.258202 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8dd478767-vd5p6"] Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.309113 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-scripts\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.309149 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-logs\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.309178 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-config-data\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.309222 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fzxr\" (UniqueName: \"kubernetes.io/projected/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-kube-api-access-9fzxr\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.309262 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-horizon-secret-key\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 
12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.311457 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.311683 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-log" containerID="cri-o://d094528a48d858baf682d85fad791241c6657f72c3cfe62385d176e9b62435d5" gracePeriod=30 Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.312301 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-httpd" containerID="cri-o://5b3ed6ca35f0875fefeb2e2c714434a74c61fea5d9c263e57f3a282989f4c939" gracePeriod=30 Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.396789 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.397103 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-log" containerID="cri-o://300b7b8748f8e764df3c9750de14ac3a0466602dd35dd8d5ff84ec3876f4ba79" gracePeriod=30 Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.397288 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-httpd" containerID="cri-o://f45a3dab6740cca432cca97ef98624da4d4a1d3ebd1ecbb2496f7941a21ba577" gracePeriod=30 Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.411201 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-scripts\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.411259 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-logs\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.411297 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-config-data\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.411355 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fzxr\" (UniqueName: \"kubernetes.io/projected/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-kube-api-access-9fzxr\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.411419 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-horizon-secret-key\") pod \"horizon-8dd478767-vd5p6\" (UID: 
\"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.412365 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-logs\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.412628 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-scripts\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.413659 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-config-data\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.416727 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55bbd9c665-gwvzl"] Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.419111 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.427007 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-horizon-secret-key\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.432629 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55bbd9c665-gwvzl"] Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.444218 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fzxr\" (UniqueName: \"kubernetes.io/projected/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-kube-api-access-9fzxr\") pod \"horizon-8dd478767-vd5p6\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.560536 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.614681 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcece101-49b0-4c88-8035-f1437c1b4ffb-logs\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.614859 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-config-data\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.615656 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjk46\" (UniqueName: \"kubernetes.io/projected/fcece101-49b0-4c88-8035-f1437c1b4ffb-kube-api-access-qjk46\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.615790 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-scripts\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.615822 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcece101-49b0-4c88-8035-f1437c1b4ffb-horizon-secret-key\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.693232 4929 generic.go:334] "Generic (PLEG): container finished" podID="74994619-1056-48dc-aece-0539c1a9ec0f" containerID="300b7b8748f8e764df3c9750de14ac3a0466602dd35dd8d5ff84ec3876f4ba79" exitCode=143 Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.693332 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74994619-1056-48dc-aece-0539c1a9ec0f","Type":"ContainerDied","Data":"300b7b8748f8e764df3c9750de14ac3a0466602dd35dd8d5ff84ec3876f4ba79"} Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.704273 4929 generic.go:334] "Generic (PLEG): container finished" podID="badb9956-b41f-474b-b15d-f65c8486611a" containerID="d094528a48d858baf682d85fad791241c6657f72c3cfe62385d176e9b62435d5" exitCode=143 Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.704645 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"badb9956-b41f-474b-b15d-f65c8486611a","Type":"ContainerDied","Data":"d094528a48d858baf682d85fad791241c6657f72c3cfe62385d176e9b62435d5"} Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.719744 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcece101-49b0-4c88-8035-f1437c1b4ffb-logs\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 
12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.719903 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-config-data\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.720100 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjk46\" (UniqueName: \"kubernetes.io/projected/fcece101-49b0-4c88-8035-f1437c1b4ffb-kube-api-access-qjk46\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.720240 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-scripts\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.720358 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcece101-49b0-4c88-8035-f1437c1b4ffb-horizon-secret-key\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.722477 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcece101-49b0-4c88-8035-f1437c1b4ffb-logs\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.723238 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-scripts\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.724856 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-config-data\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.730733 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcece101-49b0-4c88-8035-f1437c1b4ffb-horizon-secret-key\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.748727 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjk46\" (UniqueName: \"kubernetes.io/projected/fcece101-49b0-4c88-8035-f1437c1b4ffb-kube-api-access-qjk46\") pod \"horizon-55bbd9c665-gwvzl\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.920426 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55bbd9c665-gwvzl"] Oct 02 12:52:45 crc 
kubenswrapper[4929]: I1002 12:52:45.921566 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.933207 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-845cb5b59c-bsq8g"] Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.944422 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:45 crc kubenswrapper[4929]: I1002 12:52:45.944913 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-845cb5b59c-bsq8g"] Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.097229 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8dd478767-vd5p6"] Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.128545 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4s5q\" (UniqueName: \"kubernetes.io/projected/3e80fb3d-e18a-4031-8c75-921bf624a93e-kube-api-access-s4s5q\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.128757 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e80fb3d-e18a-4031-8c75-921bf624a93e-horizon-secret-key\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.128897 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-scripts\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.129156 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-config-data\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.129404 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e80fb3d-e18a-4031-8c75-921bf624a93e-logs\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.231158 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e80fb3d-e18a-4031-8c75-921bf624a93e-logs\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.231233 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4s5q\" (UniqueName: \"kubernetes.io/projected/3e80fb3d-e18a-4031-8c75-921bf624a93e-kube-api-access-s4s5q\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 
02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.231258 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e80fb3d-e18a-4031-8c75-921bf624a93e-horizon-secret-key\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.231321 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-scripts\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.231386 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-config-data\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.232598 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-config-data\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.234104 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e80fb3d-e18a-4031-8c75-921bf624a93e-logs\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.234726 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-scripts\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.239705 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e80fb3d-e18a-4031-8c75-921bf624a93e-horizon-secret-key\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.247647 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4s5q\" (UniqueName: \"kubernetes.io/projected/3e80fb3d-e18a-4031-8c75-921bf624a93e-kube-api-access-s4s5q\") pod \"horizon-845cb5b59c-bsq8g\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.273801 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.462432 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55bbd9c665-gwvzl"] Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.716440 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55bbd9c665-gwvzl" event={"ID":"fcece101-49b0-4c88-8035-f1437c1b4ffb","Type":"ContainerStarted","Data":"909587af913f41386972ac42287b5f26522d2022a51fc39fb2c78703382f11c2"} Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.718087 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dd478767-vd5p6" event={"ID":"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4","Type":"ContainerStarted","Data":"79e334b5e4c29eb3aeb79eabfb67ffdde2a479047fb3a38231f01fd530a0a27c"} Oct 02 12:52:46 crc kubenswrapper[4929]: I1002 12:52:46.772172 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-845cb5b59c-bsq8g"] Oct 02 12:52:47 crc kubenswrapper[4929]: I1002 12:52:47.730628 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845cb5b59c-bsq8g" event={"ID":"3e80fb3d-e18a-4031-8c75-921bf624a93e","Type":"ContainerStarted","Data":"eaec3f50b16acf1d6b6785a73703f251e4de6f332302d41431bdbb751dc7fe0d"} Oct 02 12:52:48 crc kubenswrapper[4929]: I1002 12:52:48.157555 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:52:51 crc kubenswrapper[4929]: E1002 12:52:48.158527 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:52:51 crc kubenswrapper[4929]: I1002 12:52:48.740949 4929 generic.go:334] "Generic (PLEG): container finished" podID="badb9956-b41f-474b-b15d-f65c8486611a" containerID="5b3ed6ca35f0875fefeb2e2c714434a74c61fea5d9c263e57f3a282989f4c939" exitCode=0 Oct 02 12:52:51 crc kubenswrapper[4929]: I1002 12:52:48.740988 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"badb9956-b41f-474b-b15d-f65c8486611a","Type":"ContainerDied","Data":"5b3ed6ca35f0875fefeb2e2c714434a74c61fea5d9c263e57f3a282989f4c939"} Oct 02 12:52:51 crc kubenswrapper[4929]: I1002 12:52:49.753631 4929 generic.go:334] "Generic (PLEG): container finished" podID="74994619-1056-48dc-aece-0539c1a9ec0f" containerID="f45a3dab6740cca432cca97ef98624da4d4a1d3ebd1ecbb2496f7941a21ba577" exitCode=0 Oct 02 12:52:51 crc kubenswrapper[4929]: I1002 12:52:49.753682 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74994619-1056-48dc-aece-0539c1a9ec0f","Type":"ContainerDied","Data":"f45a3dab6740cca432cca97ef98624da4d4a1d3ebd1ecbb2496f7941a21ba577"} Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.324485 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.342190 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.514213 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-httpd-run\") pod \"badb9956-b41f-474b-b15d-f65c8486611a\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.514806 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-logs\") pod \"74994619-1056-48dc-aece-0539c1a9ec0f\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.514872 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-combined-ca-bundle\") pod \"74994619-1056-48dc-aece-0539c1a9ec0f\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515080 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-logs\") pod \"badb9956-b41f-474b-b15d-f65c8486611a\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515196 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjpvc\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-kube-api-access-jjpvc\") pod \"74994619-1056-48dc-aece-0539c1a9ec0f\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515277 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-config-data\") pod \"badb9956-b41f-474b-b15d-f65c8486611a\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515402 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-scripts\") pod \"badb9956-b41f-474b-b15d-f65c8486611a\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515459 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-combined-ca-bundle\") pod \"badb9956-b41f-474b-b15d-f65c8486611a\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515510 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-httpd-run\") pod \"74994619-1056-48dc-aece-0539c1a9ec0f\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515536 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-scripts\") pod \"74994619-1056-48dc-aece-0539c1a9ec0f\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " Oct 02 12:52:53 crc 
kubenswrapper[4929]: I1002 12:52:53.515612 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-ceph\") pod \"74994619-1056-48dc-aece-0539c1a9ec0f\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515644 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-config-data\") pod \"74994619-1056-48dc-aece-0539c1a9ec0f\" (UID: \"74994619-1056-48dc-aece-0539c1a9ec0f\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515698 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-logs" (OuterVolumeSpecName: "logs") pod "badb9956-b41f-474b-b15d-f65c8486611a" (UID: "badb9956-b41f-474b-b15d-f65c8486611a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515719 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nckxd\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-kube-api-access-nckxd\") pod \"badb9956-b41f-474b-b15d-f65c8486611a\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515762 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-logs" (OuterVolumeSpecName: "logs") pod "74994619-1056-48dc-aece-0539c1a9ec0f" (UID: "74994619-1056-48dc-aece-0539c1a9ec0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.515836 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-ceph\") pod \"badb9956-b41f-474b-b15d-f65c8486611a\" (UID: \"badb9956-b41f-474b-b15d-f65c8486611a\") " Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.517170 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "badb9956-b41f-474b-b15d-f65c8486611a" (UID: "badb9956-b41f-474b-b15d-f65c8486611a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.517632 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.517650 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.517660 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/badb9956-b41f-474b-b15d-f65c8486611a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.519832 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74994619-1056-48dc-aece-0539c1a9ec0f" (UID: "74994619-1056-48dc-aece-0539c1a9ec0f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.523948 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-ceph" (OuterVolumeSpecName: "ceph") pod "badb9956-b41f-474b-b15d-f65c8486611a" (UID: "badb9956-b41f-474b-b15d-f65c8486611a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.524041 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-kube-api-access-nckxd" (OuterVolumeSpecName: "kube-api-access-nckxd") pod "badb9956-b41f-474b-b15d-f65c8486611a" (UID: "badb9956-b41f-474b-b15d-f65c8486611a"). InnerVolumeSpecName "kube-api-access-nckxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.527731 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-kube-api-access-jjpvc" (OuterVolumeSpecName: "kube-api-access-jjpvc") pod "74994619-1056-48dc-aece-0539c1a9ec0f" (UID: "74994619-1056-48dc-aece-0539c1a9ec0f"). InnerVolumeSpecName "kube-api-access-jjpvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.539774 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-ceph" (OuterVolumeSpecName: "ceph") pod "74994619-1056-48dc-aece-0539c1a9ec0f" (UID: "74994619-1056-48dc-aece-0539c1a9ec0f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.551436 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-scripts" (OuterVolumeSpecName: "scripts") pod "badb9956-b41f-474b-b15d-f65c8486611a" (UID: "badb9956-b41f-474b-b15d-f65c8486611a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.553117 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-scripts" (OuterVolumeSpecName: "scripts") pod "74994619-1056-48dc-aece-0539c1a9ec0f" (UID: "74994619-1056-48dc-aece-0539c1a9ec0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.619600 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjpvc\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-kube-api-access-jjpvc\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.619658 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.619670 4929 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74994619-1056-48dc-aece-0539c1a9ec0f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.619685 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.619702 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/74994619-1056-48dc-aece-0539c1a9ec0f-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.619781 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nckxd\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-kube-api-access-nckxd\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.619795 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/badb9956-b41f-474b-b15d-f65c8486611a-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.638861 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "badb9956-b41f-474b-b15d-f65c8486611a" (UID: "badb9956-b41f-474b-b15d-f65c8486611a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.658356 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74994619-1056-48dc-aece-0539c1a9ec0f" (UID: "74994619-1056-48dc-aece-0539c1a9ec0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.701200 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-config-data" (OuterVolumeSpecName: "config-data") pod "74994619-1056-48dc-aece-0539c1a9ec0f" (UID: "74994619-1056-48dc-aece-0539c1a9ec0f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.702855 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-config-data" (OuterVolumeSpecName: "config-data") pod "badb9956-b41f-474b-b15d-f65c8486611a" (UID: "badb9956-b41f-474b-b15d-f65c8486611a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.721853 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.721953 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/badb9956-b41f-474b-b15d-f65c8486611a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.722006 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.722019 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74994619-1056-48dc-aece-0539c1a9ec0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.825759 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845cb5b59c-bsq8g" event={"ID":"3e80fb3d-e18a-4031-8c75-921bf624a93e","Type":"ContainerStarted","Data":"6cc546dbf828b7531d876935e21f75bda40766a034ed98564be7367940f44004"} Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.840699 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dd478767-vd5p6" event={"ID":"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4","Type":"ContainerStarted","Data":"fc92936351b77be739a9fc9e71cfaf4dfaaa558cf656d0bc0dd10910d76e9395"} Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.843475 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74994619-1056-48dc-aece-0539c1a9ec0f","Type":"ContainerDied","Data":"d42e1c7448df57fa43aaeec1aca65b57f785f65ebc368254de8c201007b9b906"} Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.843503 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.843520 4929 scope.go:117] "RemoveContainer" containerID="f45a3dab6740cca432cca97ef98624da4d4a1d3ebd1ecbb2496f7941a21ba577" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.847093 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"badb9956-b41f-474b-b15d-f65c8486611a","Type":"ContainerDied","Data":"dbea50f9e17f6ef323077d0a35c60452f94ec335255720cec82c328e5b5ab0da"} Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.847280 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.857111 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55bbd9c665-gwvzl" event={"ID":"fcece101-49b0-4c88-8035-f1437c1b4ffb","Type":"ContainerStarted","Data":"68c8e7886b320a4616b1f1ff1365e31f6dfa4b3a90ca9dde54e02b3449676842"} Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.897611 4929 scope.go:117] "RemoveContainer" containerID="300b7b8748f8e764df3c9750de14ac3a0466602dd35dd8d5ff84ec3876f4ba79" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.914452 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.954686 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.978444 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:52:53 crc kubenswrapper[4929]: E1002 12:52:53.978906 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-log" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.978924 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-log" Oct 02 12:52:53 crc kubenswrapper[4929]: E1002 12:52:53.978943 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-httpd" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.978950 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-httpd" Oct 02 12:52:53 crc kubenswrapper[4929]: E1002 12:52:53.978982 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-httpd" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.978989 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-httpd" Oct 02 12:52:53 crc kubenswrapper[4929]: E1002 12:52:53.979023 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-log" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.979029 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-log" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.979210 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-httpd" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.979226 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-httpd" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.979246 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="badb9956-b41f-474b-b15d-f65c8486611a" containerName="glance-log" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.979255 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" containerName="glance-log" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.980338 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.984138 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ltb8t" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.987683 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.987940 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 12:52:53 crc kubenswrapper[4929]: I1002 12:52:53.988082 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.007025 4929 scope.go:117] "RemoveContainer" containerID="5b3ed6ca35f0875fefeb2e2c714434a74c61fea5d9c263e57f3a282989f4c939" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.019200 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.052216 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.060514 4929 scope.go:117] "RemoveContainer" containerID="d094528a48d858baf682d85fad791241c6657f72c3cfe62385d176e9b62435d5" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.063727 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.066133 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.073711 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.076015 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.140360 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxjxk\" (UniqueName: \"kubernetes.io/projected/1e17b33c-1789-4c39-b643-484d5bcb4f72-kube-api-access-wxjxk\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.140459 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e17b33c-1789-4c39-b643-484d5bcb4f72-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.140501 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.140546 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.140571 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.140739 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e17b33c-1789-4c39-b643-484d5bcb4f72-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.140763 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e17b33c-1789-4c39-b643-484d5bcb4f72-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.169929 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74994619-1056-48dc-aece-0539c1a9ec0f" path="/var/lib/kubelet/pods/74994619-1056-48dc-aece-0539c1a9ec0f/volumes" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.170765 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="badb9956-b41f-474b-b15d-f65c8486611a" path="/var/lib/kubelet/pods/badb9956-b41f-474b-b15d-f65c8486611a/volumes" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.243662 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qnmx\" (UniqueName: \"kubernetes.io/projected/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-kube-api-access-9qnmx\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244129 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-logs\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244173 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244213 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 
12:52:54.244255 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e17b33c-1789-4c39-b643-484d5bcb4f72-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244352 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e17b33c-1789-4c39-b643-484d5bcb4f72-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244418 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxjxk\" (UniqueName: \"kubernetes.io/projected/1e17b33c-1789-4c39-b643-484d5bcb4f72-kube-api-access-wxjxk\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244495 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e17b33c-1789-4c39-b643-484d5bcb4f72-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244568 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244661 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244777 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244833 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244851 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-ceph\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.244898 4929 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.245405 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e17b33c-1789-4c39-b643-484d5bcb4f72-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.245482 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e17b33c-1789-4c39-b643-484d5bcb4f72-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.250709 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.250866 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e17b33c-1789-4c39-b643-484d5bcb4f72-ceph\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.250912 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.257277 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e17b33c-1789-4c39-b643-484d5bcb4f72-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.266712 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxjxk\" (UniqueName: \"kubernetes.io/projected/1e17b33c-1789-4c39-b643-484d5bcb4f72-kube-api-access-wxjxk\") pod \"glance-default-internal-api-0\" (UID: \"1e17b33c-1789-4c39-b643-484d5bcb4f72\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.340857 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.346062 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnmx\" (UniqueName: \"kubernetes.io/projected/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-kube-api-access-9qnmx\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.346216 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-logs\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.346242 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.346267 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.346358 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.346382 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-ceph\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.346405 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.348160 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.348178 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-logs\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.349900 4929 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-scripts\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.356448 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-ceph\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.359061 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.360844 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-config-data\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.369883 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qnmx\" (UniqueName: \"kubernetes.io/projected/f2d786e6-cfcd-4c6b-98b8-dade07f516d5-kube-api-access-9qnmx\") pod \"glance-default-external-api-0\" (UID: \"f2d786e6-cfcd-4c6b-98b8-dade07f516d5\") " pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.396946 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.893553 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55bbd9c665-gwvzl" event={"ID":"fcece101-49b0-4c88-8035-f1437c1b4ffb","Type":"ContainerStarted","Data":"95db534e1418b7befe377c8a35573f95450b06c7ac2997f49049f3a8ad481ab3"} Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.893764 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55bbd9c665-gwvzl" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon-log" containerID="cri-o://68c8e7886b320a4616b1f1ff1365e31f6dfa4b3a90ca9dde54e02b3449676842" gracePeriod=30 Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.894548 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55bbd9c665-gwvzl" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon" containerID="cri-o://95db534e1418b7befe377c8a35573f95450b06c7ac2997f49049f3a8ad481ab3" gracePeriod=30 Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.903742 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845cb5b59c-bsq8g" event={"ID":"3e80fb3d-e18a-4031-8c75-921bf624a93e","Type":"ContainerStarted","Data":"7c63ca0c526a29a6c0d15658320ef045d7311a910bf92a13dc5f952159bc0852"} Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.914313 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dd478767-vd5p6" event={"ID":"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4","Type":"ContainerStarted","Data":"f4ccb6c320846afc326a2a0d524ec45027924444ea6815643f13ead132c308c1"} Oct 02 12:52:54 crc kubenswrapper[4929]: I1002 12:52:54.931552 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-55bbd9c665-gwvzl" podStartSLOduration=3.0417283 podStartE2EDuration="9.931533852s" podCreationTimestamp="2025-10-02 12:52:45 +0000 UTC" firstStartedPulling="2025-10-02 12:52:46.452923589 +0000 UTC m=+6167.003289953" lastFinishedPulling="2025-10-02 12:52:53.342729151 +0000 UTC m=+6173.893095505" observedRunningTime="2025-10-02 12:52:54.915033407 +0000 UTC m=+6175.465399771" watchObservedRunningTime="2025-10-02 12:52:54.931533852 +0000 UTC m=+6175.481900226" Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.035193 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8dd478767-vd5p6" podStartSLOduration=2.860349418 podStartE2EDuration="10.035169886s" podCreationTimestamp="2025-10-02 12:52:45 +0000 UTC" firstStartedPulling="2025-10-02 12:52:46.113009543 +0000 UTC m=+6166.663375907" lastFinishedPulling="2025-10-02 12:52:53.287830011 +0000 UTC m=+6173.838196375" observedRunningTime="2025-10-02 12:52:54.994195656 +0000 UTC m=+6175.544562020" watchObservedRunningTime="2025-10-02 12:52:55.035169886 +0000 UTC m=+6175.585536250" Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.035353 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-845cb5b59c-bsq8g" podStartSLOduration=3.470061202 podStartE2EDuration="10.035346951s" podCreationTimestamp="2025-10-02 12:52:45 +0000 UTC" firstStartedPulling="2025-10-02 12:52:46.775711262 +0000 UTC m=+6167.326077626" lastFinishedPulling="2025-10-02 12:52:53.340997011 +0000 UTC m=+6173.891363375" observedRunningTime="2025-10-02 12:52:54.965127559 +0000 UTC m=+6175.515494063" watchObservedRunningTime="2025-10-02 12:52:55.035346951 +0000 UTC 
m=+6175.585713325" Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.123301 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:52:55 crc kubenswrapper[4929]: W1002 12:52:55.132276 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e17b33c_1789_4c39_b643_484d5bcb4f72.slice/crio-dacc57dd3c6a35c4dbe10ca517339125991c0e5427bf5291c06b715637222c1e WatchSource:0}: Error finding container dacc57dd3c6a35c4dbe10ca517339125991c0e5427bf5291c06b715637222c1e: Status 404 returned error can't find the container with id dacc57dd3c6a35c4dbe10ca517339125991c0e5427bf5291c06b715637222c1e Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.271896 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:52:55 crc kubenswrapper[4929]: W1002 12:52:55.288516 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2d786e6_cfcd_4c6b_98b8_dade07f516d5.slice/crio-3ed1b443d141aca43cf759cef26a5da7ecbe1d9c1dbf63341a5057b6f1f6d238 WatchSource:0}: Error finding container 3ed1b443d141aca43cf759cef26a5da7ecbe1d9c1dbf63341a5057b6f1f6d238: Status 404 returned error can't find the container with id 3ed1b443d141aca43cf759cef26a5da7ecbe1d9c1dbf63341a5057b6f1f6d238 Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.561271 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.561381 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.925104 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.960243 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f2d786e6-cfcd-4c6b-98b8-dade07f516d5","Type":"ContainerStarted","Data":"3ed1b443d141aca43cf759cef26a5da7ecbe1d9c1dbf63341a5057b6f1f6d238"} Oct 02 12:52:55 crc kubenswrapper[4929]: I1002 12:52:55.975348 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e17b33c-1789-4c39-b643-484d5bcb4f72","Type":"ContainerStarted","Data":"dacc57dd3c6a35c4dbe10ca517339125991c0e5427bf5291c06b715637222c1e"} Oct 02 12:52:56 crc kubenswrapper[4929]: I1002 12:52:56.274233 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:56 crc kubenswrapper[4929]: I1002 12:52:56.274573 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:52:56 crc kubenswrapper[4929]: I1002 12:52:56.988610 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e17b33c-1789-4c39-b643-484d5bcb4f72","Type":"ContainerStarted","Data":"144d57ea72176970ba0b28d6559ee23d902f79f92b9902d1029141d186c6c770"} Oct 02 12:52:56 crc kubenswrapper[4929]: I1002 12:52:56.992658 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f2d786e6-cfcd-4c6b-98b8-dade07f516d5","Type":"ContainerStarted","Data":"8ed494c9815c4167619d0d3ab2d460afdbcff1405f300b567734605a8daf7ee1"} Oct 02 12:52:58 crc kubenswrapper[4929]: I1002 12:52:58.012755 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f2d786e6-cfcd-4c6b-98b8-dade07f516d5","Type":"ContainerStarted","Data":"f8616a0b2468362327e0f0f506140529c92c64ff272fbbd603e3ad8003313de4"} Oct 02 12:52:58 crc kubenswrapper[4929]: I1002 12:52:58.017201 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e17b33c-1789-4c39-b643-484d5bcb4f72","Type":"ContainerStarted","Data":"c961877d5b8d1be38ad4c6a4ba959df426c045a3d63199ad3ad6b960249f45a9"} Oct 02 12:52:58 crc kubenswrapper[4929]: I1002 12:52:58.065672 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.065640521 podStartE2EDuration="5.065640521s" podCreationTimestamp="2025-10-02 12:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:52:58.045213683 +0000 UTC m=+6178.595580047" watchObservedRunningTime="2025-10-02 12:52:58.065640521 +0000 UTC m=+6178.616006885" Oct 02 12:53:00 crc kubenswrapper[4929]: I1002 12:53:00.024556 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.024533276 podStartE2EDuration="7.024533276s" podCreationTimestamp="2025-10-02 12:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:52:58.0680592 +0000 UTC m=+6178.618425564" watchObservedRunningTime="2025-10-02 12:53:00.024533276 +0000 UTC m=+6180.574899640" Oct 02 12:53:00 crc kubenswrapper[4929]: I1002 12:53:00.034643 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-p528n"] Oct 02 12:53:00 crc kubenswrapper[4929]: I1002 12:53:00.045810 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-p528n"] Oct 02 12:53:00 crc kubenswrapper[4929]: I1002 12:53:00.175680 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca64815-0edf-4d20-aa7f-386f89c5f1e2" path="/var/lib/kubelet/pods/8ca64815-0edf-4d20-aa7f-386f89c5f1e2/volumes" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.085012 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5dmh9"] Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.087995 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.098668 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dmh9"] Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.260560 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-utilities\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.260820 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckd65\" (UniqueName: \"kubernetes.io/projected/da004cb2-7ffe-45da-ad05-e58e97edf9fa-kube-api-access-ckd65\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.261130 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-catalog-content\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.363397 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-catalog-content\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.363560 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-utilities\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.363586 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckd65\" (UniqueName: \"kubernetes.io/projected/da004cb2-7ffe-45da-ad05-e58e97edf9fa-kube-api-access-ckd65\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.364059 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-catalog-content\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.364102 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-utilities\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.386950 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ckd65\" (UniqueName: \"kubernetes.io/projected/da004cb2-7ffe-45da-ad05-e58e97edf9fa-kube-api-access-ckd65\") pod \"redhat-marketplace-5dmh9\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.426751 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:01 crc kubenswrapper[4929]: I1002 12:53:01.912374 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dmh9"] Oct 02 12:53:01 crc kubenswrapper[4929]: W1002 12:53:01.915104 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda004cb2_7ffe_45da_ad05_e58e97edf9fa.slice/crio-4ee63f1a99f2853e1a74f9ce2b3fc868997022a38948384133c5f3aa8fc6d7d4 WatchSource:0}: Error finding container 4ee63f1a99f2853e1a74f9ce2b3fc868997022a38948384133c5f3aa8fc6d7d4: Status 404 returned error can't find the container with id 4ee63f1a99f2853e1a74f9ce2b3fc868997022a38948384133c5f3aa8fc6d7d4 Oct 02 12:53:02 crc kubenswrapper[4929]: I1002 12:53:02.065620 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dmh9" event={"ID":"da004cb2-7ffe-45da-ad05-e58e97edf9fa","Type":"ContainerStarted","Data":"4ee63f1a99f2853e1a74f9ce2b3fc868997022a38948384133c5f3aa8fc6d7d4"} Oct 02 12:53:02 crc kubenswrapper[4929]: I1002 12:53:02.157661 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:53:02 crc kubenswrapper[4929]: E1002 12:53:02.158243 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:53:03 crc kubenswrapper[4929]: I1002 12:53:03.080993 4929 generic.go:334] "Generic (PLEG): container finished" podID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerID="1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11" exitCode=0 Oct 02 12:53:03 crc kubenswrapper[4929]: I1002 12:53:03.081045 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dmh9" event={"ID":"da004cb2-7ffe-45da-ad05-e58e97edf9fa","Type":"ContainerDied","Data":"1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11"} Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 12:53:04.341241 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 12:53:04.342592 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 12:53:04.383543 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 12:53:04.385996 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 
12:53:04.397073 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 12:53:04.397336 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 12:53:04.436501 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:53:04 crc kubenswrapper[4929]: I1002 12:53:04.444797 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:53:05 crc kubenswrapper[4929]: I1002 12:53:05.107822 4929 generic.go:334] "Generic (PLEG): container finished" podID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerID="c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a" exitCode=0 Oct 02 12:53:05 crc kubenswrapper[4929]: I1002 12:53:05.107940 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dmh9" event={"ID":"da004cb2-7ffe-45da-ad05-e58e97edf9fa","Type":"ContainerDied","Data":"c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a"} Oct 02 12:53:05 crc kubenswrapper[4929]: I1002 12:53:05.108809 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:53:05 crc kubenswrapper[4929]: I1002 12:53:05.108843 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:53:05 crc kubenswrapper[4929]: I1002 12:53:05.108855 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:53:05 crc kubenswrapper[4929]: I1002 12:53:05.109778 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:53:05 crc kubenswrapper[4929]: I1002 12:53:05.562733 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8dd478767-vd5p6" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 02 12:53:06 crc kubenswrapper[4929]: I1002 12:53:06.277706 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-845cb5b59c-bsq8g" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused" Oct 02 12:53:07 crc kubenswrapper[4929]: I1002 12:53:07.124553 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:53:07 crc kubenswrapper[4929]: I1002 12:53:07.124836 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:53:07 crc kubenswrapper[4929]: I1002 12:53:07.323569 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 12:53:07 crc kubenswrapper[4929]: I1002 12:53:07.323751 4929 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:53:07 crc kubenswrapper[4929]: I1002 12:53:07.329665 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 
12:53:07 crc kubenswrapper[4929]: I1002 12:53:07.699065 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:53:07 crc kubenswrapper[4929]: I1002 12:53:07.701587 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:53:09 crc kubenswrapper[4929]: I1002 12:53:09.146284 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dmh9" event={"ID":"da004cb2-7ffe-45da-ad05-e58e97edf9fa","Type":"ContainerStarted","Data":"fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584"} Oct 02 12:53:09 crc kubenswrapper[4929]: I1002 12:53:09.177714 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5dmh9" podStartSLOduration=3.150081748 podStartE2EDuration="8.177694839s" podCreationTimestamp="2025-10-02 12:53:01 +0000 UTC" firstStartedPulling="2025-10-02 12:53:03.083865562 +0000 UTC m=+6183.634231926" lastFinishedPulling="2025-10-02 12:53:08.111478653 +0000 UTC m=+6188.661845017" observedRunningTime="2025-10-02 12:53:09.167912927 +0000 UTC m=+6189.718279301" watchObservedRunningTime="2025-10-02 12:53:09.177694839 +0000 UTC m=+6189.728061203" Oct 02 12:53:10 crc kubenswrapper[4929]: I1002 12:53:10.028923 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76f2-account-create-x49sr"] Oct 02 12:53:10 crc kubenswrapper[4929]: I1002 12:53:10.039069 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76f2-account-create-x49sr"] Oct 02 12:53:10 crc kubenswrapper[4929]: I1002 12:53:10.174475 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e04b99-9193-46d9-9911-2ea9d88d0cc1" path="/var/lib/kubelet/pods/79e04b99-9193-46d9-9911-2ea9d88d0cc1/volumes" Oct 02 12:53:11 crc kubenswrapper[4929]: I1002 12:53:11.427058 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:11 crc kubenswrapper[4929]: I1002 12:53:11.427099 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:12 crc kubenswrapper[4929]: I1002 12:53:12.477151 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5dmh9" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="registry-server" probeResult="failure" output=< Oct 02 12:53:12 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 12:53:12 crc kubenswrapper[4929]: > Oct 02 12:53:15 crc kubenswrapper[4929]: I1002 12:53:15.158169 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:53:15 crc kubenswrapper[4929]: E1002 12:53:15.159171 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:53:17 crc kubenswrapper[4929]: I1002 12:53:17.770789 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8dd478767-vd5p6" 
Oct 02 12:53:18 crc kubenswrapper[4929]: I1002 12:53:18.206709 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:53:19 crc kubenswrapper[4929]: I1002 12:53:19.519867 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:53:20 crc kubenswrapper[4929]: I1002 12:53:20.030738 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nfwkm"] Oct 02 12:53:20 crc kubenswrapper[4929]: I1002 12:53:20.040415 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nfwkm"] Oct 02 12:53:20 crc kubenswrapper[4929]: I1002 12:53:20.121356 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:53:20 crc kubenswrapper[4929]: I1002 12:53:20.179589 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="084f7e37-6fec-4747-9bc6-c1e0bff98ab1" path="/var/lib/kubelet/pods/084f7e37-6fec-4747-9bc6-c1e0bff98ab1/volumes" Oct 02 12:53:20 crc kubenswrapper[4929]: I1002 12:53:20.184535 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8dd478767-vd5p6"] Oct 02 12:53:20 crc kubenswrapper[4929]: I1002 12:53:20.271529 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8dd478767-vd5p6" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon-log" containerID="cri-o://fc92936351b77be739a9fc9e71cfaf4dfaaa558cf656d0bc0dd10910d76e9395" gracePeriod=30 Oct 02 12:53:20 crc kubenswrapper[4929]: I1002 12:53:20.271604 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8dd478767-vd5p6" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" containerID="cri-o://f4ccb6c320846afc326a2a0d524ec45027924444ea6815643f13ead132c308c1" gracePeriod=30 Oct 02 12:53:21 crc kubenswrapper[4929]: I1002 12:53:21.475678 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:21 crc kubenswrapper[4929]: I1002 12:53:21.526941 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:21 crc kubenswrapper[4929]: I1002 12:53:21.711255 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dmh9"] Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.299769 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5dmh9" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="registry-server" containerID="cri-o://fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584" gracePeriod=2 Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.821234 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.963594 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-utilities\") pod \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.963722 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-catalog-content\") pod \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.963917 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckd65\" (UniqueName: \"kubernetes.io/projected/da004cb2-7ffe-45da-ad05-e58e97edf9fa-kube-api-access-ckd65\") pod \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\" (UID: \"da004cb2-7ffe-45da-ad05-e58e97edf9fa\") " Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.964539 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-utilities" (OuterVolumeSpecName: "utilities") pod "da004cb2-7ffe-45da-ad05-e58e97edf9fa" (UID: "da004cb2-7ffe-45da-ad05-e58e97edf9fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.969727 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da004cb2-7ffe-45da-ad05-e58e97edf9fa-kube-api-access-ckd65" (OuterVolumeSpecName: "kube-api-access-ckd65") pod "da004cb2-7ffe-45da-ad05-e58e97edf9fa" (UID: "da004cb2-7ffe-45da-ad05-e58e97edf9fa"). InnerVolumeSpecName "kube-api-access-ckd65". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:53:23 crc kubenswrapper[4929]: I1002 12:53:23.976502 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da004cb2-7ffe-45da-ad05-e58e97edf9fa" (UID: "da004cb2-7ffe-45da-ad05-e58e97edf9fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.066597 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckd65\" (UniqueName: \"kubernetes.io/projected/da004cb2-7ffe-45da-ad05-e58e97edf9fa-kube-api-access-ckd65\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.066638 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.066648 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da004cb2-7ffe-45da-ad05-e58e97edf9fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.313224 4929 generic.go:334] "Generic (PLEG): container finished" podID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerID="fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584" exitCode=0 Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.313346 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dmh9" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.313325 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dmh9" event={"ID":"da004cb2-7ffe-45da-ad05-e58e97edf9fa","Type":"ContainerDied","Data":"fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584"} Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.313899 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dmh9" event={"ID":"da004cb2-7ffe-45da-ad05-e58e97edf9fa","Type":"ContainerDied","Data":"4ee63f1a99f2853e1a74f9ce2b3fc868997022a38948384133c5f3aa8fc6d7d4"} Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.313925 4929 scope.go:117] "RemoveContainer" containerID="fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.317234 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerID="f4ccb6c320846afc326a2a0d524ec45027924444ea6815643f13ead132c308c1" exitCode=0 Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.317299 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dd478767-vd5p6" event={"ID":"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4","Type":"ContainerDied","Data":"f4ccb6c320846afc326a2a0d524ec45027924444ea6815643f13ead132c308c1"} Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.349857 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dmh9"] Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.352319 4929 scope.go:117] "RemoveContainer" containerID="c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.360348 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dmh9"] Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.374746 4929 scope.go:117] "RemoveContainer" containerID="1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.400941 4929 scope.go:117] "RemoveContainer" 
containerID="fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584" Oct 02 12:53:24 crc kubenswrapper[4929]: E1002 12:53:24.401614 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584\": container with ID starting with fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584 not found: ID does not exist" containerID="fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.401670 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584"} err="failed to get container status \"fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584\": rpc error: code = NotFound desc = could not find container \"fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584\": container with ID starting with fb74bbefb90cfa63b837cb0ec21d63df234f971599a5279e739a8c7360858584 not found: ID does not exist" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.401705 4929 scope.go:117] "RemoveContainer" containerID="c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a" Oct 02 12:53:24 crc kubenswrapper[4929]: E1002 12:53:24.402691 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a\": container with ID starting with c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a not found: ID does not exist" containerID="c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.402728 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a"} err="failed to get container status \"c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a\": rpc error: code = NotFound desc = could not find container \"c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a\": container with ID starting with c793dbd82a3eb49be6bce30e8a864019739025c2b7c2ca3e0eb66d677193cc5a not found: ID does not exist" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.402757 4929 scope.go:117] "RemoveContainer" containerID="1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11" Oct 02 12:53:24 crc kubenswrapper[4929]: E1002 12:53:24.403344 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11\": container with ID starting with 1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11 not found: ID does not exist" containerID="1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.403427 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11"} err="failed to get container status \"1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11\": rpc error: code = NotFound desc = could not find container \"1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11\": container with ID starting with 
1c648609f71ae3f906055509ca80555041164213041cd353791650f161d9de11 not found: ID does not exist" Oct 02 12:53:24 crc kubenswrapper[4929]: I1002 12:53:24.923025 4929 scope.go:117] "RemoveContainer" containerID="9f95995e2b136312306021883267e372280cb3d03bf64817eab5d03336f9b74b" Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.050189 4929 scope.go:117] "RemoveContainer" containerID="30362c07cff42ad908bb82c2f78e65f4fb77d025662016d1e5492496521dedd2" Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.111814 4929 scope.go:117] "RemoveContainer" containerID="480967e8cb034ad36623d07ca3bb12ec979c51f4eeea40530cb1f61007bf6da3" Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.222467 4929 scope.go:117] "RemoveContainer" containerID="06669eb5e32fa6abbc08f9693688c2ff9ac4da00ee276f8e87f8e0916ea73e51" Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.342900 4929 generic.go:334] "Generic (PLEG): container finished" podID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerID="95db534e1418b7befe377c8a35573f95450b06c7ac2997f49049f3a8ad481ab3" exitCode=137 Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.342930 4929 generic.go:334] "Generic (PLEG): container finished" podID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerID="68c8e7886b320a4616b1f1ff1365e31f6dfa4b3a90ca9dde54e02b3449676842" exitCode=137 Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.343117 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55bbd9c665-gwvzl" event={"ID":"fcece101-49b0-4c88-8035-f1437c1b4ffb","Type":"ContainerDied","Data":"95db534e1418b7befe377c8a35573f95450b06c7ac2997f49049f3a8ad481ab3"} Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.343179 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55bbd9c665-gwvzl" event={"ID":"fcece101-49b0-4c88-8035-f1437c1b4ffb","Type":"ContainerDied","Data":"68c8e7886b320a4616b1f1ff1365e31f6dfa4b3a90ca9dde54e02b3449676842"} Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.562490 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8dd478767-vd5p6" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 02 12:53:25 crc kubenswrapper[4929]: I1002 12:53:25.938788 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.042308 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-scripts\") pod \"fcece101-49b0-4c88-8035-f1437c1b4ffb\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.042599 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-config-data\") pod \"fcece101-49b0-4c88-8035-f1437c1b4ffb\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.042817 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcece101-49b0-4c88-8035-f1437c1b4ffb-horizon-secret-key\") pod \"fcece101-49b0-4c88-8035-f1437c1b4ffb\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.042935 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcece101-49b0-4c88-8035-f1437c1b4ffb-logs\") pod \"fcece101-49b0-4c88-8035-f1437c1b4ffb\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.043140 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjk46\" (UniqueName: \"kubernetes.io/projected/fcece101-49b0-4c88-8035-f1437c1b4ffb-kube-api-access-qjk46\") pod \"fcece101-49b0-4c88-8035-f1437c1b4ffb\" (UID: \"fcece101-49b0-4c88-8035-f1437c1b4ffb\") " Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.043553 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcece101-49b0-4c88-8035-f1437c1b4ffb-logs" (OuterVolumeSpecName: "logs") pod "fcece101-49b0-4c88-8035-f1437c1b4ffb" (UID: "fcece101-49b0-4c88-8035-f1437c1b4ffb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.044197 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcece101-49b0-4c88-8035-f1437c1b4ffb-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.048270 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcece101-49b0-4c88-8035-f1437c1b4ffb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fcece101-49b0-4c88-8035-f1437c1b4ffb" (UID: "fcece101-49b0-4c88-8035-f1437c1b4ffb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.049739 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcece101-49b0-4c88-8035-f1437c1b4ffb-kube-api-access-qjk46" (OuterVolumeSpecName: "kube-api-access-qjk46") pod "fcece101-49b0-4c88-8035-f1437c1b4ffb" (UID: "fcece101-49b0-4c88-8035-f1437c1b4ffb"). InnerVolumeSpecName "kube-api-access-qjk46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.070101 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-scripts" (OuterVolumeSpecName: "scripts") pod "fcece101-49b0-4c88-8035-f1437c1b4ffb" (UID: "fcece101-49b0-4c88-8035-f1437c1b4ffb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.082622 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-config-data" (OuterVolumeSpecName: "config-data") pod "fcece101-49b0-4c88-8035-f1437c1b4ffb" (UID: "fcece101-49b0-4c88-8035-f1437c1b4ffb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.146304 4929 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcece101-49b0-4c88-8035-f1437c1b4ffb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.146354 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjk46\" (UniqueName: \"kubernetes.io/projected/fcece101-49b0-4c88-8035-f1437c1b4ffb-kube-api-access-qjk46\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.146370 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.146381 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcece101-49b0-4c88-8035-f1437c1b4ffb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.171202 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" path="/var/lib/kubelet/pods/da004cb2-7ffe-45da-ad05-e58e97edf9fa/volumes" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.359215 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55bbd9c665-gwvzl" event={"ID":"fcece101-49b0-4c88-8035-f1437c1b4ffb","Type":"ContainerDied","Data":"909587af913f41386972ac42287b5f26522d2022a51fc39fb2c78703382f11c2"} Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.359282 4929 scope.go:117] "RemoveContainer" containerID="95db534e1418b7befe377c8a35573f95450b06c7ac2997f49049f3a8ad481ab3" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.359312 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55bbd9c665-gwvzl" Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.387344 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55bbd9c665-gwvzl"] Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.396120 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55bbd9c665-gwvzl"] Oct 02 12:53:26 crc kubenswrapper[4929]: I1002 12:53:26.556237 4929 scope.go:117] "RemoveContainer" containerID="68c8e7886b320a4616b1f1ff1365e31f6dfa4b3a90ca9dde54e02b3449676842" Oct 02 12:53:28 crc kubenswrapper[4929]: I1002 12:53:28.156472 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:53:28 crc kubenswrapper[4929]: E1002 12:53:28.157354 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:53:28 crc kubenswrapper[4929]: I1002 12:53:28.180708 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" path="/var/lib/kubelet/pods/fcece101-49b0-4c88-8035-f1437c1b4ffb/volumes" Oct 02 12:53:35 crc kubenswrapper[4929]: I1002 12:53:35.561544 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8dd478767-vd5p6" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 02 12:53:43 crc kubenswrapper[4929]: I1002 12:53:43.157219 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:53:43 crc kubenswrapper[4929]: E1002 12:53:43.157914 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:53:45 crc kubenswrapper[4929]: I1002 12:53:45.561481 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8dd478767-vd5p6" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Oct 02 12:53:45 crc kubenswrapper[4929]: I1002 12:53:45.562246 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.596609 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerID="fc92936351b77be739a9fc9e71cfaf4dfaaa558cf656d0bc0dd10910d76e9395" exitCode=137 Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.596775 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dd478767-vd5p6" 
event={"ID":"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4","Type":"ContainerDied","Data":"fc92936351b77be739a9fc9e71cfaf4dfaaa558cf656d0bc0dd10910d76e9395"} Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.788835 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.863754 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-horizon-secret-key\") pod \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.863814 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-logs\") pod \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.863937 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-scripts\") pod \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.864049 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-config-data\") pod \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.864121 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fzxr\" (UniqueName: \"kubernetes.io/projected/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-kube-api-access-9fzxr\") pod \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\" (UID: \"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4\") " Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.864513 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-logs" (OuterVolumeSpecName: "logs") pod "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" (UID: "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.864670 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.869267 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" (UID: "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.869776 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-kube-api-access-9fzxr" (OuterVolumeSpecName: "kube-api-access-9fzxr") pod "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" (UID: "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4"). 
InnerVolumeSpecName "kube-api-access-9fzxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.891196 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-scripts" (OuterVolumeSpecName: "scripts") pod "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" (UID: "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.903000 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-config-data" (OuterVolumeSpecName: "config-data") pod "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" (UID: "d1f3507b-c1a5-41e4-ba40-7eccbf3118b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.966997 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.967043 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fzxr\" (UniqueName: \"kubernetes.io/projected/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-kube-api-access-9fzxr\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.967055 4929 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:50 crc kubenswrapper[4929]: I1002 12:53:50.967064 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:51 crc kubenswrapper[4929]: I1002 12:53:51.611491 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8dd478767-vd5p6" event={"ID":"d1f3507b-c1a5-41e4-ba40-7eccbf3118b4","Type":"ContainerDied","Data":"79e334b5e4c29eb3aeb79eabfb67ffdde2a479047fb3a38231f01fd530a0a27c"} Oct 02 12:53:51 crc kubenswrapper[4929]: I1002 12:53:51.611866 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8dd478767-vd5p6" Oct 02 12:53:51 crc kubenswrapper[4929]: I1002 12:53:51.612117 4929 scope.go:117] "RemoveContainer" containerID="f4ccb6c320846afc326a2a0d524ec45027924444ea6815643f13ead132c308c1" Oct 02 12:53:51 crc kubenswrapper[4929]: I1002 12:53:51.664778 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8dd478767-vd5p6"] Oct 02 12:53:51 crc kubenswrapper[4929]: I1002 12:53:51.675121 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8dd478767-vd5p6"] Oct 02 12:53:51 crc kubenswrapper[4929]: I1002 12:53:51.803296 4929 scope.go:117] "RemoveContainer" containerID="fc92936351b77be739a9fc9e71cfaf4dfaaa558cf656d0bc0dd10910d76e9395" Oct 02 12:53:52 crc kubenswrapper[4929]: I1002 12:53:52.167684 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" path="/var/lib/kubelet/pods/d1f3507b-c1a5-41e4-ba40-7eccbf3118b4/volumes" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697183 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64cf64777-tqdzr"] Oct 02 12:53:53 crc kubenswrapper[4929]: E1002 12:53:53.697853 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon-log" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697865 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon-log" Oct 02 12:53:53 crc kubenswrapper[4929]: E1002 12:53:53.697884 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697890 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" Oct 02 12:53:53 crc kubenswrapper[4929]: E1002 12:53:53.697902 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="registry-server" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697909 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="registry-server" Oct 02 12:53:53 crc kubenswrapper[4929]: E1002 12:53:53.697923 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="extract-content" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697929 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="extract-content" Oct 02 12:53:53 crc kubenswrapper[4929]: E1002 12:53:53.697943 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="extract-utilities" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697949 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="extract-utilities" Oct 02 12:53:53 crc kubenswrapper[4929]: E1002 12:53:53.697975 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697981 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon" Oct 02 12:53:53 crc kubenswrapper[4929]: E1002 
12:53:53.697992 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon-log" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.697998 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon-log" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.698167 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.698182 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcece101-49b0-4c88-8035-f1437c1b4ffb" containerName="horizon-log" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.698193 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon-log" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.698213 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f3507b-c1a5-41e4-ba40-7eccbf3118b4" containerName="horizon" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.698221 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="da004cb2-7ffe-45da-ad05-e58e97edf9fa" containerName="registry-server" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.699374 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.761982 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64cf64777-tqdzr"] Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.819208 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-logs\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.819260 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-horizon-secret-key\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.819291 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-config-data\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.819420 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz8dh\" (UniqueName: \"kubernetes.io/projected/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-kube-api-access-tz8dh\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.819496 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-scripts\") pod \"horizon-64cf64777-tqdzr\" 
(UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.921920 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz8dh\" (UniqueName: \"kubernetes.io/projected/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-kube-api-access-tz8dh\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.922005 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-scripts\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.922163 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-logs\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.922192 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-horizon-secret-key\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.922228 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-config-data\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.922935 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-logs\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.923234 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-scripts\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.925327 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-config-data\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.927865 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-horizon-secret-key\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:53 crc kubenswrapper[4929]: I1002 12:53:53.940713 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tz8dh\" (UniqueName: \"kubernetes.io/projected/bfdf4e39-cc15-45f7-a15a-4b99136c1e6d-kube-api-access-tz8dh\") pod \"horizon-64cf64777-tqdzr\" (UID: \"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d\") " pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:54 crc kubenswrapper[4929]: I1002 12:53:54.022347 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:53:54 crc kubenswrapper[4929]: I1002 12:53:54.588702 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64cf64777-tqdzr"] Oct 02 12:53:54 crc kubenswrapper[4929]: I1002 12:53:54.685787 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cf64777-tqdzr" event={"ID":"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d","Type":"ContainerStarted","Data":"115ccc39176e5e4b7542bd5d29f678511da8339279108cd2df8f2eca6791ec76"} Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.156539 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:53:55 crc kubenswrapper[4929]: E1002 12:53:55.157110 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.698743 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-ndzrx"] Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.701338 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cf64777-tqdzr" event={"ID":"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d","Type":"ContainerStarted","Data":"57b8b513592d23ca6f8fe3f6fa800025007a935fedffb29d995c954646f2f314"} Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.701389 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64cf64777-tqdzr" event={"ID":"bfdf4e39-cc15-45f7-a15a-4b99136c1e6d","Type":"ContainerStarted","Data":"fcb7a9a95050c7adb8b022b4998af3ee5c2f73981dd3782b5117302dfa219197"} Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.701469 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ndzrx" Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.708117 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-ndzrx"] Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.751923 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64cf64777-tqdzr" podStartSLOduration=2.751905463 podStartE2EDuration="2.751905463s" podCreationTimestamp="2025-10-02 12:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:53:55.734647486 +0000 UTC m=+6236.285013850" watchObservedRunningTime="2025-10-02 12:53:55.751905463 +0000 UTC m=+6236.302271827" Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.780853 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxdbz\" (UniqueName: \"kubernetes.io/projected/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a-kube-api-access-kxdbz\") pod \"heat-db-create-ndzrx\" (UID: \"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a\") " pod="openstack/heat-db-create-ndzrx" Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.882863 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdbz\" (UniqueName: \"kubernetes.io/projected/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a-kube-api-access-kxdbz\") pod \"heat-db-create-ndzrx\" (UID: \"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a\") " pod="openstack/heat-db-create-ndzrx" Oct 02 12:53:55 crc kubenswrapper[4929]: I1002 12:53:55.902833 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdbz\" (UniqueName: \"kubernetes.io/projected/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a-kube-api-access-kxdbz\") pod \"heat-db-create-ndzrx\" (UID: \"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a\") " pod="openstack/heat-db-create-ndzrx" Oct 02 12:53:56 crc kubenswrapper[4929]: I1002 12:53:56.018799 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-ndzrx" Oct 02 12:53:56 crc kubenswrapper[4929]: I1002 12:53:56.572512 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-ndzrx"] Oct 02 12:53:56 crc kubenswrapper[4929]: I1002 12:53:56.715474 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ndzrx" event={"ID":"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a","Type":"ContainerStarted","Data":"f47c886c581a3fbfb605140386f8a8186eecc7241043eb66f914632e316ae6d6"} Oct 02 12:53:57 crc kubenswrapper[4929]: I1002 12:53:57.726912 4929 generic.go:334] "Generic (PLEG): container finished" podID="6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a" containerID="d5124b2d6d0bef2eac6f4bf796108c5a0564dd4e56ba73c65ba1740ea9cc8157" exitCode=0 Oct 02 12:53:57 crc kubenswrapper[4929]: I1002 12:53:57.727001 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ndzrx" event={"ID":"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a","Type":"ContainerDied","Data":"d5124b2d6d0bef2eac6f4bf796108c5a0564dd4e56ba73c65ba1740ea9cc8157"} Oct 02 12:53:59 crc kubenswrapper[4929]: I1002 12:53:59.166416 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ndzrx" Oct 02 12:53:59 crc kubenswrapper[4929]: I1002 12:53:59.286759 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxdbz\" (UniqueName: \"kubernetes.io/projected/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a-kube-api-access-kxdbz\") pod \"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a\" (UID: \"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a\") " Oct 02 12:53:59 crc kubenswrapper[4929]: I1002 12:53:59.293313 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a-kube-api-access-kxdbz" (OuterVolumeSpecName: "kube-api-access-kxdbz") pod "6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a" (UID: "6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a"). InnerVolumeSpecName "kube-api-access-kxdbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:53:59 crc kubenswrapper[4929]: I1002 12:53:59.389425 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxdbz\" (UniqueName: \"kubernetes.io/projected/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a-kube-api-access-kxdbz\") on node \"crc\" DevicePath \"\"" Oct 02 12:53:59 crc kubenswrapper[4929]: I1002 12:53:59.750863 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ndzrx" event={"ID":"6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a","Type":"ContainerDied","Data":"f47c886c581a3fbfb605140386f8a8186eecc7241043eb66f914632e316ae6d6"} Oct 02 12:53:59 crc kubenswrapper[4929]: I1002 12:53:59.750907 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f47c886c581a3fbfb605140386f8a8186eecc7241043eb66f914632e316ae6d6" Oct 02 12:53:59 crc kubenswrapper[4929]: I1002 12:53:59.750926 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-ndzrx" Oct 02 12:54:04 crc kubenswrapper[4929]: I1002 12:54:04.023537 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:54:04 crc kubenswrapper[4929]: I1002 12:54:04.024187 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64cf64777-tqdzr" Oct 02 12:54:05 crc kubenswrapper[4929]: I1002 12:54:05.838446 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-5fca-account-create-2gv4t"] Oct 02 12:54:05 crc kubenswrapper[4929]: E1002 12:54:05.840340 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a" containerName="mariadb-database-create" Oct 02 12:54:05 crc kubenswrapper[4929]: I1002 12:54:05.840365 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a" containerName="mariadb-database-create" Oct 02 12:54:05 crc kubenswrapper[4929]: I1002 12:54:05.840701 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a" containerName="mariadb-database-create" Oct 02 12:54:05 crc kubenswrapper[4929]: I1002 12:54:05.842331 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5fca-account-create-2gv4t" Oct 02 12:54:05 crc kubenswrapper[4929]: I1002 12:54:05.849308 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 02 12:54:05 crc kubenswrapper[4929]: I1002 12:54:05.871881 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5fca-account-create-2gv4t"] Oct 02 12:54:05 crc kubenswrapper[4929]: I1002 12:54:05.934328 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzj8j\" (UniqueName: \"kubernetes.io/projected/fd379653-5579-4af4-b401-756c91cd6f66-kube-api-access-jzj8j\") pod \"heat-5fca-account-create-2gv4t\" (UID: \"fd379653-5579-4af4-b401-756c91cd6f66\") " pod="openstack/heat-5fca-account-create-2gv4t" Oct 02 12:54:06 crc kubenswrapper[4929]: I1002 12:54:06.037501 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzj8j\" (UniqueName: \"kubernetes.io/projected/fd379653-5579-4af4-b401-756c91cd6f66-kube-api-access-jzj8j\") pod \"heat-5fca-account-create-2gv4t\" (UID: \"fd379653-5579-4af4-b401-756c91cd6f66\") " pod="openstack/heat-5fca-account-create-2gv4t" Oct 02 12:54:06 crc kubenswrapper[4929]: I1002 12:54:06.056013 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzj8j\" (UniqueName: \"kubernetes.io/projected/fd379653-5579-4af4-b401-756c91cd6f66-kube-api-access-jzj8j\") pod \"heat-5fca-account-create-2gv4t\" (UID: \"fd379653-5579-4af4-b401-756c91cd6f66\") " pod="openstack/heat-5fca-account-create-2gv4t" Oct 02 12:54:06 crc kubenswrapper[4929]: I1002 12:54:06.169666 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5fca-account-create-2gv4t" Oct 02 12:54:06 crc kubenswrapper[4929]: I1002 12:54:06.710472 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5fca-account-create-2gv4t"] Oct 02 12:54:06 crc kubenswrapper[4929]: I1002 12:54:06.830457 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5fca-account-create-2gv4t" event={"ID":"fd379653-5579-4af4-b401-756c91cd6f66","Type":"ContainerStarted","Data":"ecc93ccb573e0ff8519116d39afe8e23f9c253ed55f6bfca1b4412d99b6d535f"} Oct 02 12:54:07 crc kubenswrapper[4929]: I1002 12:54:07.850069 4929 generic.go:334] "Generic (PLEG): container finished" podID="fd379653-5579-4af4-b401-756c91cd6f66" containerID="8f116a5465a78365b06baefe321fbfd3c163be569eab3595a3630362510dd6c8" exitCode=0 Oct 02 12:54:07 crc kubenswrapper[4929]: I1002 12:54:07.850462 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5fca-account-create-2gv4t" event={"ID":"fd379653-5579-4af4-b401-756c91cd6f66","Type":"ContainerDied","Data":"8f116a5465a78365b06baefe321fbfd3c163be569eab3595a3630362510dd6c8"} Oct 02 12:54:09 crc kubenswrapper[4929]: I1002 12:54:09.295989 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5fca-account-create-2gv4t" Oct 02 12:54:09 crc kubenswrapper[4929]: I1002 12:54:09.413019 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzj8j\" (UniqueName: \"kubernetes.io/projected/fd379653-5579-4af4-b401-756c91cd6f66-kube-api-access-jzj8j\") pod \"fd379653-5579-4af4-b401-756c91cd6f66\" (UID: \"fd379653-5579-4af4-b401-756c91cd6f66\") " Oct 02 12:54:09 crc kubenswrapper[4929]: I1002 12:54:09.418938 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd379653-5579-4af4-b401-756c91cd6f66-kube-api-access-jzj8j" (OuterVolumeSpecName: "kube-api-access-jzj8j") pod "fd379653-5579-4af4-b401-756c91cd6f66" (UID: "fd379653-5579-4af4-b401-756c91cd6f66"). InnerVolumeSpecName "kube-api-access-jzj8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:54:09 crc kubenswrapper[4929]: I1002 12:54:09.517218 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzj8j\" (UniqueName: \"kubernetes.io/projected/fd379653-5579-4af4-b401-756c91cd6f66-kube-api-access-jzj8j\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:09 crc kubenswrapper[4929]: I1002 12:54:09.889311 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5fca-account-create-2gv4t" event={"ID":"fd379653-5579-4af4-b401-756c91cd6f66","Type":"ContainerDied","Data":"ecc93ccb573e0ff8519116d39afe8e23f9c253ed55f6bfca1b4412d99b6d535f"} Oct 02 12:54:09 crc kubenswrapper[4929]: I1002 12:54:09.889563 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc93ccb573e0ff8519116d39afe8e23f9c253ed55f6bfca1b4412d99b6d535f" Oct 02 12:54:09 crc kubenswrapper[4929]: I1002 12:54:09.889368 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5fca-account-create-2gv4t" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.165632 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0" Oct 02 12:54:10 crc kubenswrapper[4929]: E1002 12:54:10.166328 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.889096 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-wdlvs"] Oct 02 12:54:10 crc kubenswrapper[4929]: E1002 12:54:10.890296 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd379653-5579-4af4-b401-756c91cd6f66" containerName="mariadb-account-create" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.890380 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd379653-5579-4af4-b401-756c91cd6f66" containerName="mariadb-account-create" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.890627 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd379653-5579-4af4-b401-756c91cd6f66" containerName="mariadb-account-create" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.891483 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.894896 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-f2wcg" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.895460 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.902923 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wdlvs"] Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.959139 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-config-data\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.959251 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-combined-ca-bundle\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:10 crc kubenswrapper[4929]: I1002 12:54:10.959331 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2lh8\" (UniqueName: \"kubernetes.io/projected/2f2729fe-14bc-44f9-bedb-e9745024b9d9-kube-api-access-l2lh8\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.060660 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-config-data\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.060998 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-combined-ca-bundle\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.061124 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2lh8\" (UniqueName: \"kubernetes.io/projected/2f2729fe-14bc-44f9-bedb-e9745024b9d9-kube-api-access-l2lh8\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.067513 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-combined-ca-bundle\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.077140 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-config-data\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs" 
Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.081105 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2lh8\" (UniqueName: \"kubernetes.io/projected/2f2729fe-14bc-44f9-bedb-e9745024b9d9-kube-api-access-l2lh8\") pod \"heat-db-sync-wdlvs\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") " pod="openstack/heat-db-sync-wdlvs"
Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.228612 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wdlvs"
Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.716675 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wdlvs"]
Oct 02 12:54:11 crc kubenswrapper[4929]: I1002 12:54:11.908796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wdlvs" event={"ID":"2f2729fe-14bc-44f9-bedb-e9745024b9d9","Type":"ContainerStarted","Data":"2d79e2d7c5c0129171c1b70b983ae40fd7b66bb2a2285b6740e55f6b68bc0b47"}
Oct 02 12:54:14 crc kubenswrapper[4929]: I1002 12:54:14.025262 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64cf64777-tqdzr" podUID="bfdf4e39-cc15-45f7-a15a-4b99136c1e6d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.118:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8080: connect: connection refused"
Oct 02 12:54:16 crc kubenswrapper[4929]: I1002 12:54:16.056381 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9k24b"]
Oct 02 12:54:16 crc kubenswrapper[4929]: I1002 12:54:16.065851 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gpzrh"]
Oct 02 12:54:16 crc kubenswrapper[4929]: I1002 12:54:16.074296 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9k24b"]
Oct 02 12:54:16 crc kubenswrapper[4929]: I1002 12:54:16.082220 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gpzrh"]
Oct 02 12:54:16 crc kubenswrapper[4929]: I1002 12:54:16.170836 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d9ec94a-8101-44c6-a92f-9999dcb58e1a" path="/var/lib/kubelet/pods/2d9ec94a-8101-44c6-a92f-9999dcb58e1a/volumes"
Oct 02 12:54:16 crc kubenswrapper[4929]: I1002 12:54:16.179438 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef2c687-95a7-4ca2-a7b7-eb93733b1101" path="/var/lib/kubelet/pods/9ef2c687-95a7-4ca2-a7b7-eb93733b1101/volumes"
Oct 02 12:54:17 crc kubenswrapper[4929]: I1002 12:54:17.027638 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-96nmh"]
Oct 02 12:54:17 crc kubenswrapper[4929]: I1002 12:54:17.036039 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-96nmh"]
Oct 02 12:54:18 crc kubenswrapper[4929]: I1002 12:54:18.184499 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a55624-e5d7-4ee6-b23d-8dffe83d54b3" path="/var/lib/kubelet/pods/08a55624-e5d7-4ee6-b23d-8dffe83d54b3/volumes"
Oct 02 12:54:21 crc kubenswrapper[4929]: I1002 12:54:21.032929 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wdlvs" event={"ID":"2f2729fe-14bc-44f9-bedb-e9745024b9d9","Type":"ContainerStarted","Data":"a89b35fae137a9efc2f21e631e6e6762983d9e3df8442f825611d841f7745c6b"}
Oct 02 12:54:21 crc kubenswrapper[4929]: I1002 12:54:21.057057 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-wdlvs" podStartSLOduration=2.959061563 podStartE2EDuration="11.057038517s" podCreationTimestamp="2025-10-02 12:54:10 +0000 UTC" firstStartedPulling="2025-10-02 12:54:11.730192965 +0000 UTC m=+6252.280559329" lastFinishedPulling="2025-10-02 12:54:19.828169909 +0000 UTC m=+6260.378536283" observedRunningTime="2025-10-02 12:54:21.055930195 +0000 UTC m=+6261.606296569" watchObservedRunningTime="2025-10-02 12:54:21.057038517 +0000 UTC m=+6261.607404881"
Oct 02 12:54:22 crc kubenswrapper[4929]: I1002 12:54:22.045890 4929 generic.go:334] "Generic (PLEG): container finished" podID="2f2729fe-14bc-44f9-bedb-e9745024b9d9" containerID="a89b35fae137a9efc2f21e631e6e6762983d9e3df8442f825611d841f7745c6b" exitCode=0
Oct 02 12:54:22 crc kubenswrapper[4929]: I1002 12:54:22.045937 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wdlvs" event={"ID":"2f2729fe-14bc-44f9-bedb-e9745024b9d9","Type":"ContainerDied","Data":"a89b35fae137a9efc2f21e631e6e6762983d9e3df8442f825611d841f7745c6b"}
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.162970 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0"
Oct 02 12:54:23 crc kubenswrapper[4929]: E1002 12:54:23.163723 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.399218 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wdlvs"
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.537857 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-combined-ca-bundle\") pod \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") "
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.538104 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2lh8\" (UniqueName: \"kubernetes.io/projected/2f2729fe-14bc-44f9-bedb-e9745024b9d9-kube-api-access-l2lh8\") pod \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") "
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.538424 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-config-data\") pod \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\" (UID: \"2f2729fe-14bc-44f9-bedb-e9745024b9d9\") "
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.549485 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2729fe-14bc-44f9-bedb-e9745024b9d9-kube-api-access-l2lh8" (OuterVolumeSpecName: "kube-api-access-l2lh8") pod "2f2729fe-14bc-44f9-bedb-e9745024b9d9" (UID: "2f2729fe-14bc-44f9-bedb-e9745024b9d9"). InnerVolumeSpecName "kube-api-access-l2lh8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.582141 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f2729fe-14bc-44f9-bedb-e9745024b9d9" (UID: "2f2729fe-14bc-44f9-bedb-e9745024b9d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.620563 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-config-data" (OuterVolumeSpecName: "config-data") pod "2f2729fe-14bc-44f9-bedb-e9745024b9d9" (UID: "2f2729fe-14bc-44f9-bedb-e9745024b9d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.641649 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.641702 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2729fe-14bc-44f9-bedb-e9745024b9d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:54:23 crc kubenswrapper[4929]: I1002 12:54:23.641719 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2lh8\" (UniqueName: \"kubernetes.io/projected/2f2729fe-14bc-44f9-bedb-e9745024b9d9-kube-api-access-l2lh8\") on node \"crc\" DevicePath \"\""
Oct 02 12:54:24 crc kubenswrapper[4929]: I1002 12:54:24.070493 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wdlvs" event={"ID":"2f2729fe-14bc-44f9-bedb-e9745024b9d9","Type":"ContainerDied","Data":"2d79e2d7c5c0129171c1b70b983ae40fd7b66bb2a2285b6740e55f6b68bc0b47"}
Oct 02 12:54:24 crc kubenswrapper[4929]: I1002 12:54:24.070553 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d79e2d7c5c0129171c1b70b983ae40fd7b66bb2a2285b6740e55f6b68bc0b47"
Oct 02 12:54:24 crc kubenswrapper[4929]: I1002 12:54:24.070564 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wdlvs"
Oct 02 12:54:25 crc kubenswrapper[4929]: I1002 12:54:25.406558 4929 scope.go:117] "RemoveContainer" containerID="27a037976d0d34792eb18d25b699807582081b85779e7560ae87d589a0faa9f0"
Oct 02 12:54:25 crc kubenswrapper[4929]: I1002 12:54:25.493908 4929 scope.go:117] "RemoveContainer" containerID="814c660a3434c370a8db2d8f9c12de347b2267020c9a397d9e02fac00651c7bb"
Oct 02 12:54:25 crc kubenswrapper[4929]: I1002 12:54:25.543790 4929 scope.go:117] "RemoveContainer" containerID="8d7df33ee47a37fe18ac3f14dad21af8031616527d1a176b9a5726dfe6628108"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.179648 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6dd94f695d-82pw4"]
Oct 02 12:54:26 crc kubenswrapper[4929]: E1002 12:54:26.180408 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2729fe-14bc-44f9-bedb-e9745024b9d9" containerName="heat-db-sync"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.180430 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2729fe-14bc-44f9-bedb-e9745024b9d9" containerName="heat-db-sync"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.180705 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2729fe-14bc-44f9-bedb-e9745024b9d9" containerName="heat-db-sync"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.181493 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.186084 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.186340 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.189843 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-f2wcg"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.229083 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6dd94f695d-82pw4"]
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.300604 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-config-data-custom\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.300995 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-config-data\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.301122 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkd7g\" (UniqueName: \"kubernetes.io/projected/2b95af5b-0003-4ffb-9c24-796e635a2252-kube-api-access-vkd7g\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.301146 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-combined-ca-bundle\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.363267 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-d4c9d7757-cmsx6"]
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.364799 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.367784 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.399240 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d4c9d7757-cmsx6"]
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.403389 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-config-data-custom\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.403624 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-config-data\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.405048 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkd7g\" (UniqueName: \"kubernetes.io/projected/2b95af5b-0003-4ffb-9c24-796e635a2252-kube-api-access-vkd7g\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.405088 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-combined-ca-bundle\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.409790 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-combined-ca-bundle\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.410586 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-config-data-custom\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.426152 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b95af5b-0003-4ffb-9c24-796e635a2252-config-data\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.429884 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkd7g\" (UniqueName: \"kubernetes.io/projected/2b95af5b-0003-4ffb-9c24-796e635a2252-kube-api-access-vkd7g\") pod \"heat-engine-6dd94f695d-82pw4\" (UID: \"2b95af5b-0003-4ffb-9c24-796e635a2252\") " pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.435029 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-d5d869965-b6fjv"]
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.436767 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.442277 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.453982 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-d5d869965-b6fjv"]
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.506442 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507022 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-combined-ca-bundle\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-config-data-custom\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507110 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-config-data-custom\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507183 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz8dj\" (UniqueName: \"kubernetes.io/projected/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-kube-api-access-mz8dj\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507219 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhpx9\" (UniqueName: \"kubernetes.io/projected/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-kube-api-access-zhpx9\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507237 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-config-data\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507324 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-config-data\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.507352 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-combined-ca-bundle\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609145 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-config-data-custom\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609448 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz8dj\" (UniqueName: \"kubernetes.io/projected/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-kube-api-access-mz8dj\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609494 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhpx9\" (UniqueName: \"kubernetes.io/projected/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-kube-api-access-zhpx9\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609513 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-config-data\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609568 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-config-data\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609591 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-combined-ca-bundle\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609631 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-combined-ca-bundle\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.609655 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-config-data-custom\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.630996 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-combined-ca-bundle\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.633949 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-config-data\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.635228 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhpx9\" (UniqueName: \"kubernetes.io/projected/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-kube-api-access-zhpx9\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.638718 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-config-data-custom\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.639255 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-config-data\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.644809 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e0cc909-9c71-492c-9f3a-6bc2acdb8d31-config-data-custom\") pod \"heat-cfnapi-d5d869965-b6fjv\" (UID: \"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31\") " pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.649784 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz8dj\" (UniqueName: \"kubernetes.io/projected/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-kube-api-access-mz8dj\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.653725 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f215210a-1592-4fc1-9dee-b9b0a3f6ed80-combined-ca-bundle\") pod \"heat-api-d4c9d7757-cmsx6\" (UID: \"f215210a-1592-4fc1-9dee-b9b0a3f6ed80\") " pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.686125 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.691560 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64cf64777-tqdzr"
Oct 02 12:54:26 crc kubenswrapper[4929]: I1002 12:54:26.827532 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.042429 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4baa-account-create-xxb26"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.061113 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2fa9-account-create-6rgp4"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.085618 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-94f9-account-create-zjvn2"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.103143 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4baa-account-create-xxb26"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.107838 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6dd94f695d-82pw4" event={"ID":"2b95af5b-0003-4ffb-9c24-796e635a2252","Type":"ContainerStarted","Data":"8a800cd9ffd2bd9e787eebdf144ad16b9eb6c90f4c20307037d6309bbdf79de7"}
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.115369 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-94f9-account-create-zjvn2"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.130784 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2fa9-account-create-6rgp4"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.140154 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6dd94f695d-82pw4"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.240215 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d4c9d7757-cmsx6"]
Oct 02 12:54:27 crc kubenswrapper[4929]: I1002 12:54:27.390187 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-d5d869965-b6fjv"]
Oct 02 12:54:27 crc kubenswrapper[4929]: W1002 12:54:27.391012 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0cc909_9c71_492c_9f3a_6bc2acdb8d31.slice/crio-b77fb188d010ac2fa8949601082559ca6ca41a2a70c738c716e4fa0a1c290e19 WatchSource:0}: Error finding container b77fb188d010ac2fa8949601082559ca6ca41a2a70c738c716e4fa0a1c290e19: Status 404 returned error can't find the container with id b77fb188d010ac2fa8949601082559ca6ca41a2a70c738c716e4fa0a1c290e19
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.130678 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6dd94f695d-82pw4" event={"ID":"2b95af5b-0003-4ffb-9c24-796e635a2252","Type":"ContainerStarted","Data":"bf519ce3b1bef7e0665c32dfc5ba3f7acc339cdee0f7f71c65e102afffb2f2ed"}
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.131840 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.133079 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d5d869965-b6fjv" event={"ID":"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31","Type":"ContainerStarted","Data":"b77fb188d010ac2fa8949601082559ca6ca41a2a70c738c716e4fa0a1c290e19"}
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.135275 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d4c9d7757-cmsx6" event={"ID":"f215210a-1592-4fc1-9dee-b9b0a3f6ed80","Type":"ContainerStarted","Data":"665bff5e919edfd424c490acc5068f7765bfa12f11df5943ba107e2bea1703be"}
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.150421 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6dd94f695d-82pw4" podStartSLOduration=2.15039594 podStartE2EDuration="2.15039594s" podCreationTimestamp="2025-10-02 12:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:54:28.14936549 +0000 UTC m=+6268.699731854" watchObservedRunningTime="2025-10-02 12:54:28.15039594 +0000 UTC m=+6268.700762314"
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.172429 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056a2f7c-05d0-413b-9fdd-52493121b1f4" path="/var/lib/kubelet/pods/056a2f7c-05d0-413b-9fdd-52493121b1f4/volumes"
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.173258 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0714e7b6-c0b5-419f-b517-d71ae7f2a6ae" path="/var/lib/kubelet/pods/0714e7b6-c0b5-419f-b517-d71ae7f2a6ae/volumes"
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.176250 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3644ee3-4a0a-4623-bf6d-c1b8302e8baa" path="/var/lib/kubelet/pods/f3644ee3-4a0a-4623-bf6d-c1b8302e8baa/volumes"
Oct 02 12:54:28 crc kubenswrapper[4929]: I1002 12:54:28.934728 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-64cf64777-tqdzr"
Oct 02 12:54:29 crc kubenswrapper[4929]: I1002 12:54:29.057550 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-845cb5b59c-bsq8g"]
Oct 02 12:54:29 crc kubenswrapper[4929]: I1002 12:54:29.057816 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-845cb5b59c-bsq8g" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" containerID="cri-o://7c63ca0c526a29a6c0d15658320ef045d7311a910bf92a13dc5f952159bc0852" gracePeriod=30
Oct 02 12:54:29 crc kubenswrapper[4929]: I1002 12:54:29.058006 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-845cb5b59c-bsq8g" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon-log" containerID="cri-o://6cc546dbf828b7531d876935e21f75bda40766a034ed98564be7367940f44004" gracePeriod=30
Oct 02 12:54:30 crc kubenswrapper[4929]: I1002 12:54:30.176829 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d4c9d7757-cmsx6" event={"ID":"f215210a-1592-4fc1-9dee-b9b0a3f6ed80","Type":"ContainerStarted","Data":"9502e0b26cd85327865b32d52b1a46aa1cbac68dce15f39414bd34099924b7c2"}
Oct 02 12:54:30 crc kubenswrapper[4929]: I1002 12:54:30.177211 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:30 crc kubenswrapper[4929]: I1002 12:54:30.178661 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d5d869965-b6fjv" event={"ID":"5e0cc909-9c71-492c-9f3a-6bc2acdb8d31","Type":"ContainerStarted","Data":"189993b110c73a74c428a8db800a0c7c6c0506ae728f4aff41398bc05b1bbe5a"}
Oct 02 12:54:30 crc kubenswrapper[4929]: I1002 12:54:30.227884 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-d4c9d7757-cmsx6" podStartSLOduration=1.819040809 podStartE2EDuration="4.227864138s" podCreationTimestamp="2025-10-02 12:54:26 +0000 UTC" firstStartedPulling="2025-10-02 12:54:27.245886899 +0000 UTC m=+6267.796253263" lastFinishedPulling="2025-10-02 12:54:29.654710228 +0000 UTC m=+6270.205076592" observedRunningTime="2025-10-02 12:54:30.221902206 +0000 UTC m=+6270.772268570" watchObservedRunningTime="2025-10-02 12:54:30.227864138 +0000 UTC m=+6270.778230502"
Oct 02 12:54:30 crc kubenswrapper[4929]: I1002 12:54:30.246838 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-d5d869965-b6fjv" podStartSLOduration=1.989449045 podStartE2EDuration="4.246815633s" podCreationTimestamp="2025-10-02 12:54:26 +0000 UTC" firstStartedPulling="2025-10-02 12:54:27.393571511 +0000 UTC m=+6267.943937875" lastFinishedPulling="2025-10-02 12:54:29.650938099 +0000 UTC m=+6270.201304463" observedRunningTime="2025-10-02 12:54:30.242594012 +0000 UTC m=+6270.792960366" watchObservedRunningTime="2025-10-02 12:54:30.246815633 +0000 UTC m=+6270.797181997"
Oct 02 12:54:31 crc kubenswrapper[4929]: I1002 12:54:31.186569 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:33 crc kubenswrapper[4929]: I1002 12:54:33.205916 4929 generic.go:334] "Generic (PLEG): container finished" podID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerID="7c63ca0c526a29a6c0d15658320ef045d7311a910bf92a13dc5f952159bc0852" exitCode=0
Oct 02 12:54:33 crc kubenswrapper[4929]: I1002 12:54:33.205979 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845cb5b59c-bsq8g" event={"ID":"3e80fb3d-e18a-4031-8c75-921bf624a93e","Type":"ContainerDied","Data":"7c63ca0c526a29a6c0d15658320ef045d7311a910bf92a13dc5f952159bc0852"}
Oct 02 12:54:36 crc kubenswrapper[4929]: I1002 12:54:36.276239 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-845cb5b59c-bsq8g" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused"
Oct 02 12:54:37 crc kubenswrapper[4929]: I1002 12:54:37.030328 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gpx7r"]
Oct 02 12:54:37 crc kubenswrapper[4929]: I1002 12:54:37.040363 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gpx7r"]
Oct 02 12:54:37 crc kubenswrapper[4929]: I1002 12:54:37.157296 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0"
Oct 02 12:54:37 crc kubenswrapper[4929]: E1002 12:54:37.157563 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 12:54:38 crc kubenswrapper[4929]: I1002 12:54:38.099350 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-d4c9d7757-cmsx6"
Oct 02 12:54:38 crc kubenswrapper[4929]: I1002 12:54:38.170261 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd7de5b-4bbd-42a0-b032-6177901cab75" path="/var/lib/kubelet/pods/cbd7de5b-4bbd-42a0-b032-6177901cab75/volumes"
Oct 02 12:54:38 crc kubenswrapper[4929]: I1002 12:54:38.234269 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-d5d869965-b6fjv"
Oct 02 12:54:46 crc kubenswrapper[4929]: I1002 12:54:46.274903 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-845cb5b59c-bsq8g" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused"
Oct 02 12:54:46 crc kubenswrapper[4929]: I1002 12:54:46.540952 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6dd94f695d-82pw4"
Oct 02 12:54:51 crc kubenswrapper[4929]: I1002 12:54:51.157684 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0"
Oct 02 12:54:52 crc kubenswrapper[4929]: I1002 12:54:52.386589 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"71dbe36c7a6d8e09fe4ce647ebe551abca28a81abef88dabeb8a84825d9cf7fa"}
Oct 02 12:54:55 crc kubenswrapper[4929]: I1002 12:54:55.929442 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"]
Oct 02 12:54:55 crc kubenswrapper[4929]: I1002 12:54:55.932265 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:55 crc kubenswrapper[4929]: I1002 12:54:55.937446 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 02 12:54:55 crc kubenswrapper[4929]: I1002 12:54:55.948728 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"]
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.040848 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tp2vw"]
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.049916 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xn629"]
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.059102 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tp2vw"]
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.067191 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xn629"]
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.076423 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pd85\" (UniqueName: \"kubernetes.io/projected/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-kube-api-access-7pd85\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.076555 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.076630 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.169335 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c4613a7-f848-47e6-9ada-737b3de390d9" path="/var/lib/kubelet/pods/3c4613a7-f848-47e6-9ada-737b3de390d9/volumes"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.170711 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf080ed-d0fb-4e55-8a95-11964070b99a" path="/var/lib/kubelet/pods/abf080ed-d0fb-4e55-8a95-11964070b99a/volumes"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.178272 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.178400 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pd85\" (UniqueName: \"kubernetes.io/projected/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-kube-api-access-7pd85\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.178475 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.178878 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.178902 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.198360 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pd85\" (UniqueName: \"kubernetes.io/projected/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-kube-api-access-7pd85\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.256230 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.278503 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-845cb5b59c-bsq8g" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.278892 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-845cb5b59c-bsq8g"
Oct 02 12:54:56 crc kubenswrapper[4929]: I1002 12:54:56.695419 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"]
Oct 02 12:54:57 crc kubenswrapper[4929]: I1002 12:54:57.434030 4929 generic.go:334] "Generic (PLEG): container finished" podID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerID="cf781f0797ab3e82566afe33a35d03e6e7e33bb4e38d81ade464514b42b84816" exitCode=0
Oct 02 12:54:57 crc kubenswrapper[4929]: I1002 12:54:57.434185 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k" event={"ID":"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255","Type":"ContainerDied","Data":"cf781f0797ab3e82566afe33a35d03e6e7e33bb4e38d81ade464514b42b84816"}
Oct 02 12:54:57 crc kubenswrapper[4929]: I1002 12:54:57.434621 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k" event={"ID":"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255","Type":"ContainerStarted","Data":"9840253653d00dc749d5a0aa3cc5e61b0f1bbdf751823c929494d2b4e2eb8beb"}
Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.462056 4929 generic.go:334] "Generic (PLEG): container finished" podID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerID="6cc546dbf828b7531d876935e21f75bda40766a034ed98564be7367940f44004" exitCode=137
Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.462243 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845cb5b59c-bsq8g" event={"ID":"3e80fb3d-e18a-4031-8c75-921bf624a93e","Type":"ContainerDied","Data":"6cc546dbf828b7531d876935e21f75bda40766a034ed98564be7367940f44004"}
Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.462681 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-845cb5b59c-bsq8g" event={"ID":"3e80fb3d-e18a-4031-8c75-921bf624a93e","Type":"ContainerDied","Data":"eaec3f50b16acf1d6b6785a73703f251e4de6f332302d41431bdbb751dc7fe0d"}
Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.462705 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaec3f50b16acf1d6b6785a73703f251e4de6f332302d41431bdbb751dc7fe0d"
Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.465138 4929 generic.go:334] "Generic (PLEG): container finished" podID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerID="880233e93d1adfe478f070409064ede767b98df942b4f7b2aedcec655640ead2" exitCode=0
Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.465173 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k"
event={"ID":"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255","Type":"ContainerDied","Data":"880233e93d1adfe478f070409064ede767b98df942b4f7b2aedcec655640ead2"} Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.491188 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.551340 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-scripts\") pod \"3e80fb3d-e18a-4031-8c75-921bf624a93e\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.551394 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e80fb3d-e18a-4031-8c75-921bf624a93e-horizon-secret-key\") pod \"3e80fb3d-e18a-4031-8c75-921bf624a93e\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.551471 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e80fb3d-e18a-4031-8c75-921bf624a93e-logs\") pod \"3e80fb3d-e18a-4031-8c75-921bf624a93e\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.551593 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-config-data\") pod \"3e80fb3d-e18a-4031-8c75-921bf624a93e\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.551640 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4s5q\" (UniqueName: \"kubernetes.io/projected/3e80fb3d-e18a-4031-8c75-921bf624a93e-kube-api-access-s4s5q\") pod \"3e80fb3d-e18a-4031-8c75-921bf624a93e\" (UID: \"3e80fb3d-e18a-4031-8c75-921bf624a93e\") " Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.552844 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e80fb3d-e18a-4031-8c75-921bf624a93e-logs" (OuterVolumeSpecName: "logs") pod "3e80fb3d-e18a-4031-8c75-921bf624a93e" (UID: "3e80fb3d-e18a-4031-8c75-921bf624a93e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.556620 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e80fb3d-e18a-4031-8c75-921bf624a93e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3e80fb3d-e18a-4031-8c75-921bf624a93e" (UID: "3e80fb3d-e18a-4031-8c75-921bf624a93e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.556659 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e80fb3d-e18a-4031-8c75-921bf624a93e-kube-api-access-s4s5q" (OuterVolumeSpecName: "kube-api-access-s4s5q") pod "3e80fb3d-e18a-4031-8c75-921bf624a93e" (UID: "3e80fb3d-e18a-4031-8c75-921bf624a93e"). InnerVolumeSpecName "kube-api-access-s4s5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.577921 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-config-data" (OuterVolumeSpecName: "config-data") pod "3e80fb3d-e18a-4031-8c75-921bf624a93e" (UID: "3e80fb3d-e18a-4031-8c75-921bf624a93e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.579030 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-scripts" (OuterVolumeSpecName: "scripts") pod "3e80fb3d-e18a-4031-8c75-921bf624a93e" (UID: "3e80fb3d-e18a-4031-8c75-921bf624a93e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.654208 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.654239 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4s5q\" (UniqueName: \"kubernetes.io/projected/3e80fb3d-e18a-4031-8c75-921bf624a93e-kube-api-access-s4s5q\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.654250 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e80fb3d-e18a-4031-8c75-921bf624a93e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.654260 4929 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e80fb3d-e18a-4031-8c75-921bf624a93e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:59 crc kubenswrapper[4929]: I1002 12:54:59.654304 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e80fb3d-e18a-4031-8c75-921bf624a93e-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:55:00 crc kubenswrapper[4929]: I1002 12:55:00.475833 4929 generic.go:334] "Generic (PLEG): container finished" podID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerID="d57f9f26c78d9177680d64dd5a2f8995848c6f38c47a26f76765e4ef94bc274f" exitCode=0 Oct 02 12:55:00 crc kubenswrapper[4929]: I1002 12:55:00.475935 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k" event={"ID":"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255","Type":"ContainerDied","Data":"d57f9f26c78d9177680d64dd5a2f8995848c6f38c47a26f76765e4ef94bc274f"} Oct 02 12:55:00 crc kubenswrapper[4929]: I1002 12:55:00.476434 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-845cb5b59c-bsq8g" Oct 02 12:55:00 crc kubenswrapper[4929]: I1002 12:55:00.519430 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-845cb5b59c-bsq8g"] Oct 02 12:55:00 crc kubenswrapper[4929]: I1002 12:55:00.527680 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-845cb5b59c-bsq8g"] Oct 02 12:55:01 crc kubenswrapper[4929]: I1002 12:55:01.932770 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.002244 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pd85\" (UniqueName: \"kubernetes.io/projected/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-kube-api-access-7pd85\") pod \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.002565 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-util\") pod \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.002724 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-bundle\") pod \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\" (UID: \"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255\") " Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.004678 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-bundle" (OuterVolumeSpecName: "bundle") pod "5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" (UID: "5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.010082 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-kube-api-access-7pd85" (OuterVolumeSpecName: "kube-api-access-7pd85") pod "5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" (UID: "5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255"). InnerVolumeSpecName "kube-api-access-7pd85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.012872 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-util" (OuterVolumeSpecName: "util") pod "5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" (UID: "5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.105454 4929 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.105490 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pd85\" (UniqueName: \"kubernetes.io/projected/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-kube-api-access-7pd85\") on node \"crc\" DevicePath \"\"" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.105501 4929 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255-util\") on node \"crc\" DevicePath \"\"" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.171147 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" path="/var/lib/kubelet/pods/3e80fb3d-e18a-4031-8c75-921bf624a93e/volumes" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.497101 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k" event={"ID":"5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255","Type":"ContainerDied","Data":"9840253653d00dc749d5a0aa3cc5e61b0f1bbdf751823c929494d2b4e2eb8beb"} Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.497628 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9840253653d00dc749d5a0aa3cc5e61b0f1bbdf751823c929494d2b4e2eb8beb" Oct 02 12:55:02 crc kubenswrapper[4929]: I1002 12:55:02.497314 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k" Oct 02 12:55:10 crc kubenswrapper[4929]: E1002 12:55:10.209396 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e80fb3d_e18a_4031_8c75_921bf624a93e.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.053974 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8w2p"] Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.063264 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-x8w2p"] Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.751368 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5"] Oct 02 12:55:15 crc kubenswrapper[4929]: E1002 12:55:15.751776 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerName="util" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.751792 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerName="util" Oct 02 12:55:15 crc kubenswrapper[4929]: E1002 12:55:15.751803 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.751811 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" Oct 02 12:55:15 crc kubenswrapper[4929]: E1002 12:55:15.751840 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon-log" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.751893 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon-log" Oct 02 12:55:15 crc kubenswrapper[4929]: E1002 12:55:15.751930 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerName="extract" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.751935 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerName="extract" Oct 02 12:55:15 crc kubenswrapper[4929]: E1002 12:55:15.752006 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerName="pull" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.752016 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerName="pull" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.752209 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.752222 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255" containerName="extract" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.752237 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e80fb3d-e18a-4031-8c75-921bf624a93e" containerName="horizon-log" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.753056 4929 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.765550 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.766740 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5"] Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.766951 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.767181 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rnqxg" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.905270 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c"] Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.907510 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.910575 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.910854 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-p94jf" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.912538 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brt5r\" (UniqueName: \"kubernetes.io/projected/efb8bd58-0f3b-44ec-900a-12e77329a35d-kube-api-access-brt5r\") pod \"obo-prometheus-operator-7c8cf85677-qrjq5\" (UID: \"efb8bd58-0f3b-44ec-900a-12e77329a35d\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.914902 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c"] Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.924104 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8"] Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.926063 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:15 crc kubenswrapper[4929]: I1002 12:55:15.938499 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8"] Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.014324 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37fba1c2-0ff5-44e6-8192-14c4ba4c4e22-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8\" (UID: \"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.014380 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/068f6f40-2d52-4600-a441-df5ca2543dad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c\" (UID: \"068f6f40-2d52-4600-a441-df5ca2543dad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.014417 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brt5r\" (UniqueName: \"kubernetes.io/projected/efb8bd58-0f3b-44ec-900a-12e77329a35d-kube-api-access-brt5r\") pod \"obo-prometheus-operator-7c8cf85677-qrjq5\" (UID: \"efb8bd58-0f3b-44ec-900a-12e77329a35d\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.014473 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37fba1c2-0ff5-44e6-8192-14c4ba4c4e22-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8\" (UID: \"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.014551 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/068f6f40-2d52-4600-a441-df5ca2543dad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c\" (UID: \"068f6f40-2d52-4600-a441-df5ca2543dad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.041761 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brt5r\" (UniqueName: \"kubernetes.io/projected/efb8bd58-0f3b-44ec-900a-12e77329a35d-kube-api-access-brt5r\") pod \"obo-prometheus-operator-7c8cf85677-qrjq5\" (UID: \"efb8bd58-0f3b-44ec-900a-12e77329a35d\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.068564 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-fmst2"] Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.070258 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.075358 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.075690 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-g4h5q" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.098076 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-fmst2"] Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.117339 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37fba1c2-0ff5-44e6-8192-14c4ba4c4e22-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8\" (UID: \"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.117441 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/068f6f40-2d52-4600-a441-df5ca2543dad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c\" (UID: \"068f6f40-2d52-4600-a441-df5ca2543dad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.117539 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37fba1c2-0ff5-44e6-8192-14c4ba4c4e22-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8\" (UID: \"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.117563 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/068f6f40-2d52-4600-a441-df5ca2543dad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c\" (UID: \"068f6f40-2d52-4600-a441-df5ca2543dad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.125842 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/37fba1c2-0ff5-44e6-8192-14c4ba4c4e22-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8\" (UID: \"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.127920 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/068f6f40-2d52-4600-a441-df5ca2543dad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c\" (UID: \"068f6f40-2d52-4600-a441-df5ca2543dad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.128684 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/068f6f40-2d52-4600-a441-df5ca2543dad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c\" (UID: \"068f6f40-2d52-4600-a441-df5ca2543dad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.135350 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.136451 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/37fba1c2-0ff5-44e6-8192-14c4ba4c4e22-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8\" (UID: \"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.182610 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d855a0-eeae-434a-afbd-dc37924f454f" path="/var/lib/kubelet/pods/34d855a0-eeae-434a-afbd-dc37924f454f/volumes" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.215790 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bqlh9"] Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.217359 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.219061 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9zps\" (UniqueName: \"kubernetes.io/projected/3f310c2b-2d4a-4538-a08e-d87f3da76b2f-kube-api-access-c9zps\") pod \"observability-operator-cc5f78dfc-fmst2\" (UID: \"3f310c2b-2d4a-4538-a08e-d87f3da76b2f\") " pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.219140 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f310c2b-2d4a-4538-a08e-d87f3da76b2f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-fmst2\" (UID: \"3f310c2b-2d4a-4538-a08e-d87f3da76b2f\") " pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.219912 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gb6wk" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.235212 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.247189 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bqlh9"] Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.253678 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.324337 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z59bb\" (UniqueName: \"kubernetes.io/projected/04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa-kube-api-access-z59bb\") pod \"perses-operator-54bc95c9fb-bqlh9\" (UID: \"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa\") " pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.325493 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bqlh9\" (UID: \"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa\") " pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.325612 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9zps\" (UniqueName: \"kubernetes.io/projected/3f310c2b-2d4a-4538-a08e-d87f3da76b2f-kube-api-access-c9zps\") pod \"observability-operator-cc5f78dfc-fmst2\" (UID: \"3f310c2b-2d4a-4538-a08e-d87f3da76b2f\") " pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.325749 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f310c2b-2d4a-4538-a08e-d87f3da76b2f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-fmst2\" (UID: \"3f310c2b-2d4a-4538-a08e-d87f3da76b2f\") " pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.344184 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f310c2b-2d4a-4538-a08e-d87f3da76b2f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-fmst2\" (UID: \"3f310c2b-2d4a-4538-a08e-d87f3da76b2f\") " pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.344240 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9zps\" (UniqueName: \"kubernetes.io/projected/3f310c2b-2d4a-4538-a08e-d87f3da76b2f-kube-api-access-c9zps\") pod \"observability-operator-cc5f78dfc-fmst2\" (UID: \"3f310c2b-2d4a-4538-a08e-d87f3da76b2f\") " pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.371539 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.430210 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z59bb\" (UniqueName: \"kubernetes.io/projected/04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa-kube-api-access-z59bb\") pod \"perses-operator-54bc95c9fb-bqlh9\" (UID: \"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa\") " pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.430332 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bqlh9\" (UID: \"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa\") " pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.431203 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-bqlh9\" (UID: \"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa\") " pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.456065 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z59bb\" (UniqueName: \"kubernetes.io/projected/04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa-kube-api-access-z59bb\") pod \"perses-operator-54bc95c9fb-bqlh9\" (UID: \"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa\") " pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.695735 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.837989 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5"] Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.927101 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c"] Oct 02 12:55:16 crc kubenswrapper[4929]: W1002 12:55:16.931795 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod068f6f40_2d52_4600_a441_df5ca2543dad.slice/crio-b90eb75ad31ce26c473863b66c2450b0415d35c2ba0b362d1bc88df93ce7884c WatchSource:0}: Error finding container b90eb75ad31ce26c473863b66c2450b0415d35c2ba0b362d1bc88df93ce7884c: Status 404 returned error can't find the container with id b90eb75ad31ce26c473863b66c2450b0415d35c2ba0b362d1bc88df93ce7884c Oct 02 12:55:16 crc kubenswrapper[4929]: I1002 12:55:16.942900 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8"] Oct 02 12:55:17 crc kubenswrapper[4929]: I1002 12:55:17.077254 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-fmst2"] Oct 02 12:55:17 crc kubenswrapper[4929]: W1002 12:55:17.112637 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f310c2b_2d4a_4538_a08e_d87f3da76b2f.slice/crio-7484c0644d6893ead6dc8c5c16ff895f647f872fde1794566d6451ce601d3208 WatchSource:0}: Error finding container 7484c0644d6893ead6dc8c5c16ff895f647f872fde1794566d6451ce601d3208: Status 404 returned error can't find the container with id 7484c0644d6893ead6dc8c5c16ff895f647f872fde1794566d6451ce601d3208 Oct 02 12:55:17 crc kubenswrapper[4929]: W1002 12:55:17.290756 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04cddf71_5b78_47a3_a3a9_a7e65d8fe1aa.slice/crio-c1be1e0628bc63caf1f10caa0e3d5d059bf509a14cab11d1ad1e65b104f461be WatchSource:0}: Error finding container c1be1e0628bc63caf1f10caa0e3d5d059bf509a14cab11d1ad1e65b104f461be: Status 404 returned error can't find the container with id c1be1e0628bc63caf1f10caa0e3d5d059bf509a14cab11d1ad1e65b104f461be Oct 02 12:55:17 crc kubenswrapper[4929]: I1002 12:55:17.312272 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-bqlh9"] Oct 02 12:55:17 crc kubenswrapper[4929]: I1002 12:55:17.683643 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" event={"ID":"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22","Type":"ContainerStarted","Data":"d7f6dd0a44963f798386c9c397dd2be04c3087dda72c38b33e453402f33fb313"} Oct 02 12:55:17 crc kubenswrapper[4929]: I1002 12:55:17.686238 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" event={"ID":"3f310c2b-2d4a-4538-a08e-d87f3da76b2f","Type":"ContainerStarted","Data":"7484c0644d6893ead6dc8c5c16ff895f647f872fde1794566d6451ce601d3208"} Oct 02 12:55:17 crc kubenswrapper[4929]: I1002 12:55:17.687722 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" 
event={"ID":"068f6f40-2d52-4600-a441-df5ca2543dad","Type":"ContainerStarted","Data":"b90eb75ad31ce26c473863b66c2450b0415d35c2ba0b362d1bc88df93ce7884c"} Oct 02 12:55:17 crc kubenswrapper[4929]: I1002 12:55:17.689521 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" event={"ID":"efb8bd58-0f3b-44ec-900a-12e77329a35d","Type":"ContainerStarted","Data":"bf9e6fde1b4eff912dc1c052746a9f710b71903814ba0c0122e8801fdcf5d91e"} Oct 02 12:55:17 crc kubenswrapper[4929]: I1002 12:55:17.691722 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" event={"ID":"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa","Type":"ContainerStarted","Data":"c1be1e0628bc63caf1f10caa0e3d5d059bf509a14cab11d1ad1e65b104f461be"} Oct 02 12:55:20 crc kubenswrapper[4929]: E1002 12:55:20.500671 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e80fb3d_e18a_4031_8c75_921bf624a93e.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.684673 4929 scope.go:117] "RemoveContainer" containerID="8fe17b090a5d90d1f7cc5553174538ccac24094b91760141b50d2ee4fe9ffa42" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.705410 4929 scope.go:117] "RemoveContainer" containerID="c3bb585c8a289d60b72eab1ded792bcece5a7c86bce449978a083b6ad756b47f" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.777445 4929 scope.go:117] "RemoveContainer" containerID="dbdd91ae8e0158aaeab352d333673767a30d5b08904d28e9bc076b14dd10e2f9" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.813737 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" event={"ID":"37fba1c2-0ff5-44e6-8192-14c4ba4c4e22","Type":"ContainerStarted","Data":"2312796caab8ba2e3e13c2bb977faa41a0fed916a1fdd5fefb60567d8f9e3598"} Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.816636 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" event={"ID":"3f310c2b-2d4a-4538-a08e-d87f3da76b2f","Type":"ContainerStarted","Data":"29b23af10a0687acafaa4d213fbd377d1cf32c29df2aa8ec75e35e5fc995e415"} Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.817857 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.823441 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.833282 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" event={"ID":"068f6f40-2d52-4600-a441-df5ca2543dad","Type":"ContainerStarted","Data":"92592e92d40aa11a87e08a4104c4b5054c2a003ce9ebbc661cfdc0df045c015b"} Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.835496 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8" podStartSLOduration=3.326447869 podStartE2EDuration="10.835477398s" podCreationTimestamp="2025-10-02 12:55:15 +0000 UTC" firstStartedPulling="2025-10-02 12:55:16.947825799 +0000 UTC 
m=+6317.498192153" lastFinishedPulling="2025-10-02 12:55:24.456855328 +0000 UTC m=+6325.007221682" observedRunningTime="2025-10-02 12:55:25.833511201 +0000 UTC m=+6326.383877565" watchObservedRunningTime="2025-10-02 12:55:25.835477398 +0000 UTC m=+6326.385843762" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.839732 4929 scope.go:117] "RemoveContainer" containerID="7609d508a51d8c5f8eb4128611d34cbc3442a69f30a4f7a4b4d17a52f46926c0" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.841786 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" event={"ID":"efb8bd58-0f3b-44ec-900a-12e77329a35d","Type":"ContainerStarted","Data":"ebdcefe5766fa7e071ca500267ee2c738141fda8618abb289a3b6acb4c4cd71f"} Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.857519 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c" podStartSLOduration=3.347198496 podStartE2EDuration="10.857504102s" podCreationTimestamp="2025-10-02 12:55:15 +0000 UTC" firstStartedPulling="2025-10-02 12:55:16.935758281 +0000 UTC m=+6317.486124645" lastFinishedPulling="2025-10-02 12:55:24.446063887 +0000 UTC m=+6324.996430251" observedRunningTime="2025-10-02 12:55:25.85675094 +0000 UTC m=+6326.407117304" watchObservedRunningTime="2025-10-02 12:55:25.857504102 +0000 UTC m=+6326.407870466" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.858913 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" event={"ID":"04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa","Type":"ContainerStarted","Data":"098bcb292e6cbec2f6b2d5da8697c79a098bcf6f4197b01d1f7e54fb14388111"} Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.859903 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.913588 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-fmst2" podStartSLOduration=2.533930202 podStartE2EDuration="9.913562096s" podCreationTimestamp="2025-10-02 12:55:16 +0000 UTC" firstStartedPulling="2025-10-02 12:55:17.116054052 +0000 UTC m=+6317.666420416" lastFinishedPulling="2025-10-02 12:55:24.495685946 +0000 UTC m=+6325.046052310" observedRunningTime="2025-10-02 12:55:25.88244493 +0000 UTC m=+6326.432811294" watchObservedRunningTime="2025-10-02 12:55:25.913562096 +0000 UTC m=+6326.463928460" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.921012 4929 scope.go:117] "RemoveContainer" containerID="9a9f31e9a027290df783b4e7a44c54fe1d1355201d4e3e480f3ef40f1a34afe6" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.957533 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qrjq5" podStartSLOduration=3.366358008 podStartE2EDuration="10.957512921s" podCreationTimestamp="2025-10-02 12:55:15 +0000 UTC" firstStartedPulling="2025-10-02 12:55:16.863375538 +0000 UTC m=+6317.413741902" lastFinishedPulling="2025-10-02 12:55:24.454530441 +0000 UTC m=+6325.004896815" observedRunningTime="2025-10-02 12:55:25.929231717 +0000 UTC m=+6326.479598081" watchObservedRunningTime="2025-10-02 12:55:25.957512921 +0000 UTC m=+6326.507879285" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.990789 4929 scope.go:117] "RemoveContainer" 
containerID="c8e3678828060ab54ec90381b477ddf8e2c2d851b350911e16caad3cc7f2449a" Oct 02 12:55:25 crc kubenswrapper[4929]: I1002 12:55:25.991291 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" podStartSLOduration=2.833392333 podStartE2EDuration="9.991270403s" podCreationTimestamp="2025-10-02 12:55:16 +0000 UTC" firstStartedPulling="2025-10-02 12:55:17.293727267 +0000 UTC m=+6317.844093621" lastFinishedPulling="2025-10-02 12:55:24.451605327 +0000 UTC m=+6325.001971691" observedRunningTime="2025-10-02 12:55:25.953354191 +0000 UTC m=+6326.503720555" watchObservedRunningTime="2025-10-02 12:55:25.991270403 +0000 UTC m=+6326.541636767" Oct 02 12:55:26 crc kubenswrapper[4929]: I1002 12:55:26.049413 4929 scope.go:117] "RemoveContainer" containerID="33b43bf7f91b8e5ef8dd5aed1fad0674b63494091738c29da35c12a0d0620d1a" Oct 02 12:55:30 crc kubenswrapper[4929]: E1002 12:55:30.822927 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e80fb3d_e18a_4031_8c75_921bf624a93e.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:55:36 crc kubenswrapper[4929]: I1002 12:55:36.698467 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-bqlh9" Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.145369 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.146126 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" containerName="openstackclient" containerID="cri-o://1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75" gracePeriod=2 Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.160800 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.207515 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 12:55:39 crc kubenswrapper[4929]: E1002 12:55:39.207940 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" containerName="openstackclient" Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.207952 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" containerName="openstackclient" Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.208171 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" containerName="openstackclient" Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.208838 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.216833 4929 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" podUID="6c22dd30-f774-4d88-8e74-70a1dac9c474"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.230893 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.308292 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gt76\" (UniqueName: \"kubernetes.io/projected/6c22dd30-f774-4d88-8e74-70a1dac9c474-kube-api-access-5gt76\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.308360 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c22dd30-f774-4d88-8e74-70a1dac9c474-openstack-config\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.308509 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c22dd30-f774-4d88-8e74-70a1dac9c474-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.413458 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gt76\" (UniqueName: \"kubernetes.io/projected/6c22dd30-f774-4d88-8e74-70a1dac9c474-kube-api-access-5gt76\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.413504 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c22dd30-f774-4d88-8e74-70a1dac9c474-openstack-config\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.413578 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c22dd30-f774-4d88-8e74-70a1dac9c474-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.415456 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c22dd30-f774-4d88-8e74-70a1dac9c474-openstack-config\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.440744 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gt76\" (UniqueName: \"kubernetes.io/projected/6c22dd30-f774-4d88-8e74-70a1dac9c474-kube-api-access-5gt76\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.441084 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c22dd30-f774-4d88-8e74-70a1dac9c474-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c22dd30-f774-4d88-8e74-70a1dac9c474\") " pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.519099 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.520705 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.540085 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.542096 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.564373 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-545tm"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.625214 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzb2t\" (UniqueName: \"kubernetes.io/projected/d79f4d99-93e0-46d9-ac92-9ef18cbc992c-kube-api-access-dzb2t\") pod \"kube-state-metrics-0\" (UID: \"d79f4d99-93e0-46d9-ac92-9ef18cbc992c\") " pod="openstack/kube-state-metrics-0"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.727643 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzb2t\" (UniqueName: \"kubernetes.io/projected/d79f4d99-93e0-46d9-ac92-9ef18cbc992c-kube-api-access-dzb2t\") pod \"kube-state-metrics-0\" (UID: \"d79f4d99-93e0-46d9-ac92-9ef18cbc992c\") " pod="openstack/kube-state-metrics-0"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.775245 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzb2t\" (UniqueName: \"kubernetes.io/projected/d79f4d99-93e0-46d9-ac92-9ef18cbc992c-kube-api-access-dzb2t\") pod \"kube-state-metrics-0\" (UID: \"d79f4d99-93e0-46d9-ac92-9ef18cbc992c\") " pod="openstack/kube-state-metrics-0"
Oct 02 12:55:39 crc kubenswrapper[4929]: I1002 12:55:39.982424 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.308787 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.311913 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.316690 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.316812 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.316989 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-9qgd7"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.317101 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.340683 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.453363 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/da433ef9-9937-45f5-b5ef-121eed623b2f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.453447 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da433ef9-9937-45f5-b5ef-121eed623b2f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.453486 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da433ef9-9937-45f5-b5ef-121eed623b2f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.453549 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgqc\" (UniqueName: \"kubernetes.io/projected/da433ef9-9937-45f5-b5ef-121eed623b2f-kube-api-access-2bgqc\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.453601 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/da433ef9-9937-45f5-b5ef-121eed623b2f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.453635 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da433ef9-9937-45f5-b5ef-121eed623b2f-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.556254 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da433ef9-9937-45f5-b5ef-121eed623b2f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.556329 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da433ef9-9937-45f5-b5ef-121eed623b2f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.556400 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgqc\" (UniqueName: \"kubernetes.io/projected/da433ef9-9937-45f5-b5ef-121eed623b2f-kube-api-access-2bgqc\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.556457 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/da433ef9-9937-45f5-b5ef-121eed623b2f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.556503 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da433ef9-9937-45f5-b5ef-121eed623b2f-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.556588 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/da433ef9-9937-45f5-b5ef-121eed623b2f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.558117 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/da433ef9-9937-45f5-b5ef-121eed623b2f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.563158 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.567168 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/da433ef9-9937-45f5-b5ef-121eed623b2f-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.567923 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/da433ef9-9937-45f5-b5ef-121eed623b2f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.568712 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/da433ef9-9937-45f5-b5ef-121eed623b2f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.582119 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/da433ef9-9937-45f5-b5ef-121eed623b2f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.594417 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgqc\" (UniqueName: \"kubernetes.io/projected/da433ef9-9937-45f5-b5ef-121eed623b2f-kube-api-access-2bgqc\") pod \"alertmanager-metric-storage-0\" (UID: \"da433ef9-9937-45f5-b5ef-121eed623b2f\") " pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.653596 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.959403 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.975850 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.993006 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.993276 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.993380 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.993477 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-k2tph"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.995085 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Oct 02 12:55:40 crc kubenswrapper[4929]: I1002 12:55:40.995154 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.006915 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.065351 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077325 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-config\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077417 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077485 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077534 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8fceab33-2fbd-4885-8d95-87f1a28c9c65-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077563 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhdzb\" (UniqueName: \"kubernetes.io/projected/8fceab33-2fbd-4885-8d95-87f1a28c9c65-kube-api-access-rhdzb\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077586 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8fceab33-2fbd-4885-8d95-87f1a28c9c65-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077646 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8fceab33-2fbd-4885-8d95-87f1a28c9c65-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077670 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.077680 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d79f4d99-93e0-46d9-ac92-9ef18cbc992c","Type":"ContainerStarted","Data":"8301248b277e5d173d304990b260ddfb1c1cdeea5379c099965e68df6050c1e8"}
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.082502 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6c22dd30-f774-4d88-8e74-70a1dac9c474","Type":"ContainerStarted","Data":"36cf1eee38a9e11841b71bd6cc349396366ef0314496c13c8a2b9c0ea1017d39"}
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.180636 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.180740 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8fceab33-2fbd-4885-8d95-87f1a28c9c65-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.180783 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhdzb\" (UniqueName: \"kubernetes.io/projected/8fceab33-2fbd-4885-8d95-87f1a28c9c65-kube-api-access-rhdzb\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.180811 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8fceab33-2fbd-4885-8d95-87f1a28c9c65-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.180885 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8fceab33-2fbd-4885-8d95-87f1a28c9c65-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.180915 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.181050 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-config\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.181130 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.188709 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8fceab33-2fbd-4885-8d95-87f1a28c9c65-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.201214 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8fceab33-2fbd-4885-8d95-87f1a28c9c65-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.213153 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8fceab33-2fbd-4885-8d95-87f1a28c9c65-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.213296 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-config\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.214373 4929 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.214406 4929 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2c251b1568e3bdd3152d014cc4a0eccd8fbbf317e7336f479ac4134368c56000/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.219778 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.220540 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8fceab33-2fbd-4885-8d95-87f1a28c9c65-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.239796 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhdzb\" (UniqueName: \"kubernetes.io/projected/8fceab33-2fbd-4885-8d95-87f1a28c9c65-kube-api-access-rhdzb\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: E1002 12:55:41.313074 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e80fb3d_e18a_4031_8c75_921bf624a93e.slice\": RecentStats: unable to find data in memory cache]"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.439896 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dc39196d-2c3c-4507-ad42-b99138c2873b\") pod \"prometheus-metric-storage-0\" (UID: \"8fceab33-2fbd-4885-8d95-87f1a28c9c65\") " pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.607918 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.696976 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.962328 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 02 12:55:41 crc kubenswrapper[4929]: I1002 12:55:41.966923 4929 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" podUID="6c22dd30-f774-4d88-8e74-70a1dac9c474"
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.061865 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config\") pod \"071cc895-9a58-4f01-90cd-f0095a6b0f22\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") "
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.062336 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config-secret\") pod \"071cc895-9a58-4f01-90cd-f0095a6b0f22\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") "
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.062611 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgdpd\" (UniqueName: \"kubernetes.io/projected/071cc895-9a58-4f01-90cd-f0095a6b0f22-kube-api-access-tgdpd\") pod \"071cc895-9a58-4f01-90cd-f0095a6b0f22\" (UID: \"071cc895-9a58-4f01-90cd-f0095a6b0f22\") "
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.083128 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071cc895-9a58-4f01-90cd-f0095a6b0f22-kube-api-access-tgdpd" (OuterVolumeSpecName: "kube-api-access-tgdpd") pod "071cc895-9a58-4f01-90cd-f0095a6b0f22" (UID: "071cc895-9a58-4f01-90cd-f0095a6b0f22"). InnerVolumeSpecName "kube-api-access-tgdpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.099817 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "071cc895-9a58-4f01-90cd-f0095a6b0f22" (UID: "071cc895-9a58-4f01-90cd-f0095a6b0f22"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.118105 4929 generic.go:334] "Generic (PLEG): container finished" podID="071cc895-9a58-4f01-90cd-f0095a6b0f22" containerID="1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75" exitCode=137
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.118224 4929 scope.go:117] "RemoveContainer" containerID="1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75"
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.118305 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.124301 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da433ef9-9937-45f5-b5ef-121eed623b2f","Type":"ContainerStarted","Data":"2510748da2247e25b4796c802bd2e27e67c20882f607be50296f9de3429292f7"}
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.124580 4929 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" podUID="6c22dd30-f774-4d88-8e74-70a1dac9c474"
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.156114 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "071cc895-9a58-4f01-90cd-f0095a6b0f22" (UID: "071cc895-9a58-4f01-90cd-f0095a6b0f22"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.168349 4929 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config\") on node \"crc\" DevicePath \"\""
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.168381 4929 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/071cc895-9a58-4f01-90cd-f0095a6b0f22-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.168392 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgdpd\" (UniqueName: \"kubernetes.io/projected/071cc895-9a58-4f01-90cd-f0095a6b0f22-kube-api-access-tgdpd\") on node \"crc\" DevicePath \"\""
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.183265 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" path="/var/lib/kubelet/pods/071cc895-9a58-4f01-90cd-f0095a6b0f22/volumes"
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.426175 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.475092 4929 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="071cc895-9a58-4f01-90cd-f0095a6b0f22" podUID="6c22dd30-f774-4d88-8e74-70a1dac9c474"
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.779427 4929 scope.go:117] "RemoveContainer" containerID="1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75"
Oct 02 12:55:42 crc kubenswrapper[4929]: E1002 12:55:42.780869 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75\": container with ID starting with 1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75 not found: ID does not exist" containerID="1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75"
Oct 02 12:55:42 crc kubenswrapper[4929]: I1002 12:55:42.780922 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75"} err="failed to get container status \"1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75\": rpc error: code = NotFound desc = could not find container \"1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75\": container with ID starting with 1379a7d635ecf564a7a281acaadb4ea64b10aeb6e15c342423111f6b89241e75 not found: ID does not exist"
Oct 02 12:55:43 crc kubenswrapper[4929]: I1002 12:55:43.135282 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6c22dd30-f774-4d88-8e74-70a1dac9c474","Type":"ContainerStarted","Data":"8e9f9aea0d86a1024753f0654be29f864f6a06ca0b11b1bfe82cb48df1f52048"}
Oct 02 12:55:43 crc kubenswrapper[4929]: I1002 12:55:43.139874 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d79f4d99-93e0-46d9-ac92-9ef18cbc992c","Type":"ContainerStarted","Data":"2d0a32fb8834cd86cee5218916d605d81b276515db306040cf62bdac6133ec74"}
Oct 02 12:55:43 crc kubenswrapper[4929]: I1002 12:55:43.140026 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 02 12:55:43 crc kubenswrapper[4929]: I1002 12:55:43.143581 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8fceab33-2fbd-4885-8d95-87f1a28c9c65","Type":"ContainerStarted","Data":"8ba431b8266b08d356b3e7e58171449e733934e45d2fd286c5b5629fa01ba53e"}
Oct 02 12:55:43 crc kubenswrapper[4929]: I1002 12:55:43.169001 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.168952709 podStartE2EDuration="4.168952709s" podCreationTimestamp="2025-10-02 12:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:55:43.158577809 +0000 UTC m=+6343.708944183" watchObservedRunningTime="2025-10-02 12:55:43.168952709 +0000 UTC m=+6343.719319063"
Oct 02 12:55:43 crc kubenswrapper[4929]: I1002 12:55:43.189644 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.408269978 podStartE2EDuration="4.189625176s" podCreationTimestamp="2025-10-02 12:55:39 +0000 UTC" firstStartedPulling="2025-10-02 12:55:41.041116194 +0000 UTC m=+6341.591482558" lastFinishedPulling="2025-10-02 12:55:42.822471392 +0000 UTC m=+6343.372837756" observedRunningTime="2025-10-02 12:55:43.180825142 +0000 UTC m=+6343.731191526" watchObservedRunningTime="2025-10-02 12:55:43.189625176 +0000 UTC m=+6343.739991540"
Oct 02 12:55:48 crc kubenswrapper[4929]: I1002 12:55:48.190790 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8fceab33-2fbd-4885-8d95-87f1a28c9c65","Type":"ContainerStarted","Data":"ce1a8868e8f59f859b647fc6b2ace186da31e20ffbc1d6feac69d39bbe6f5283"}
Oct 02 12:55:48 crc kubenswrapper[4929]: I1002 12:55:48.192815 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da433ef9-9937-45f5-b5ef-121eed623b2f","Type":"ContainerStarted","Data":"1af5e80c1e84665ce314251857f0b903b6b5bb08f61f36450cd11f6fb2962731"}
Oct 02 12:55:50 crc kubenswrapper[4929]: I1002 12:55:50.045856 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 02 12:55:51 crc kubenswrapper[4929]: E1002 12:55:51.583761 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e80fb3d_e18a_4031_8c75_921bf624a93e.slice\": RecentStats: unable to find data in memory cache]"
Oct 02 12:55:53 crc kubenswrapper[4929]: I1002 12:55:53.240224 4929 generic.go:334] "Generic (PLEG): container finished" podID="8fceab33-2fbd-4885-8d95-87f1a28c9c65" containerID="ce1a8868e8f59f859b647fc6b2ace186da31e20ffbc1d6feac69d39bbe6f5283" exitCode=0
Oct 02 12:55:53 crc kubenswrapper[4929]: I1002 12:55:53.240281 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8fceab33-2fbd-4885-8d95-87f1a28c9c65","Type":"ContainerDied","Data":"ce1a8868e8f59f859b647fc6b2ace186da31e20ffbc1d6feac69d39bbe6f5283"}
Oct 02 12:55:53 crc kubenswrapper[4929]: I1002 12:55:53.241835 4929 generic.go:334] "Generic (PLEG): container finished" podID="da433ef9-9937-45f5-b5ef-121eed623b2f" containerID="1af5e80c1e84665ce314251857f0b903b6b5bb08f61f36450cd11f6fb2962731" exitCode=0
Oct 02 12:55:53 crc kubenswrapper[4929]: I1002 12:55:53.241872 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da433ef9-9937-45f5-b5ef-121eed623b2f","Type":"ContainerDied","Data":"1af5e80c1e84665ce314251857f0b903b6b5bb08f61f36450cd11f6fb2962731"}
Oct 02 12:55:59 crc kubenswrapper[4929]: I1002 12:55:59.037867 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-f62zg"]
Oct 02 12:55:59 crc kubenswrapper[4929]: I1002 12:55:59.046710 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-f62zg"]
Oct 02 12:56:00 crc kubenswrapper[4929]: I1002 12:56:00.170167 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9390df5a-244b-456d-b23a-be1ddcac2bc5" path="/var/lib/kubelet/pods/9390df5a-244b-456d-b23a-be1ddcac2bc5/volumes"
Oct 02 12:56:00 crc kubenswrapper[4929]: I1002 12:56:00.317618 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8fceab33-2fbd-4885-8d95-87f1a28c9c65","Type":"ContainerStarted","Data":"5d83f6febfea88bd5db68ba5748fabdada34dc7803e51a32c0d892fd9dfbe3e5"}
Oct 02 12:56:00 crc kubenswrapper[4929]: I1002 12:56:00.320442 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da433ef9-9937-45f5-b5ef-121eed623b2f","Type":"ContainerStarted","Data":"d2af7b5f97cfd6daeaae0155ae2e4c9ce56e3f239e437584f4bf619c66e89a45"}
Oct 02 12:56:03 crc kubenswrapper[4929]: I1002 12:56:03.358587 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8fceab33-2fbd-4885-8d95-87f1a28c9c65","Type":"ContainerStarted","Data":"21d7d599a6fec9274739331d3861f490cc7149fccb107b1b5e17a49519835e34"}
Oct 02 12:56:03 crc kubenswrapper[4929]: I1002 12:56:03.361323 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"da433ef9-9937-45f5-b5ef-121eed623b2f","Type":"ContainerStarted","Data":"83160d88b7735e416767146f0ade5cb540d1182502ea62a74b52c9aa3e0f3301"}
Oct 02 12:56:03 crc kubenswrapper[4929]: I1002 12:56:03.361737 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:56:03 crc kubenswrapper[4929]: I1002 12:56:03.364216 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Oct 02 12:56:03 crc kubenswrapper[4929]: I1002 12:56:03.383777 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.023907496 podStartE2EDuration="23.38375837s" podCreationTimestamp="2025-10-02 12:55:40 +0000 UTC" firstStartedPulling="2025-10-02 12:55:41.696133061 +0000 UTC m=+6342.246499425" lastFinishedPulling="2025-10-02 12:55:59.055983935 +0000 UTC m=+6359.606350299" observedRunningTime="2025-10-02 12:56:03.381554487 +0000 UTC m=+6363.931920851" watchObservedRunningTime="2025-10-02 12:56:03.38375837 +0000 UTC m=+6363.934124734"
Oct 02 12:56:06 crc kubenswrapper[4929]: I1002 12:56:06.395692 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8fceab33-2fbd-4885-8d95-87f1a28c9c65","Type":"ContainerStarted","Data":"fd41d46df248d6bfb32d0a07ebfd95d83f63bdef4b0e2d05cd3ff23dab307c4f"}
Oct 02 12:56:06 crc kubenswrapper[4929]: I1002 12:56:06.427946 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.7568594090000005 podStartE2EDuration="27.427926623s" podCreationTimestamp="2025-10-02 12:55:39 +0000 UTC" firstStartedPulling="2025-10-02 12:55:42.789411707 +0000 UTC m=+6343.339778071" lastFinishedPulling="2025-10-02 12:56:05.460478921 +0000 UTC m=+6366.010845285" observedRunningTime="2025-10-02 12:56:06.424890575 +0000 UTC m=+6366.975256939" watchObservedRunningTime="2025-10-02 12:56:06.427926623 +0000 UTC m=+6366.978292987"
Oct 02 12:56:06 crc kubenswrapper[4929]: I1002 12:56:06.697126 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:56:09 crc kubenswrapper[4929]: I1002 12:56:09.031597 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-92b0-account-create-8sm79"]
Oct 02 12:56:09 crc kubenswrapper[4929]: I1002 12:56:09.041936 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-92b0-account-create-8sm79"]
Oct 02 12:56:10 crc kubenswrapper[4929]: I1002 12:56:10.226184 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69eb84ff-3ff3-4b30-955f-a5570beb8c31" path="/var/lib/kubelet/pods/69eb84ff-3ff3-4b30-955f-a5570beb8c31/volumes"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.002690 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.005686 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.011218 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.011383 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.021886 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.091614 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.091674 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbzd\" (UniqueName: \"kubernetes.io/projected/f7ec048f-ac83-4e45-814b-97a5bebb8c41-kube-api-access-hnbzd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.091733 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-run-httpd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.092004 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-log-httpd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.092056 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-scripts\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.092141 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-config-data\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.092170 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.194029 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-log-httpd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.194079 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-scripts\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.194114 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-config-data\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.194133 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.194220 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.194240 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbzd\" (UniqueName: \"kubernetes.io/projected/f7ec048f-ac83-4e45-814b-97a5bebb8c41-kube-api-access-hnbzd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.194268 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-run-httpd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.195094 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-log-httpd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.195415 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-run-httpd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.200388 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-config-data\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.201877 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-scripts\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.203265 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.210258 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.212665 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbzd\" (UniqueName: \"kubernetes.io/projected/f7ec048f-ac83-4e45-814b-97a5bebb8c41-kube-api-access-hnbzd\") pod \"ceilometer-0\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.325822 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.698181 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.704639 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:56:11 crc kubenswrapper[4929]: I1002 12:56:11.819874 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:56:12 crc kubenswrapper[4929]: I1002 12:56:12.462024 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerStarted","Data":"d484f489f6068396fa9c0649dd47ffb7cf151ea50f9d455247f8a56ad1030d7f"}
Oct 02 12:56:12 crc kubenswrapper[4929]: I1002 12:56:12.463579 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Oct 02 12:56:13 crc kubenswrapper[4929]: I1002 12:56:13.480223 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerStarted","Data":"5bc991da5ea2d943c40749f81f60af209eefa027a4a6e0f3ecc1c5a71fa4aa15"}
Oct 02 12:56:13 crc kubenswrapper[4929]: I1002 12:56:13.480696 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerStarted","Data":"62d3cba6685771dd692d0b9d744f60c2fc802aaf61d0f715475506e100ca3de9"}
Oct 02 12:56:14 crc kubenswrapper[4929]: I1002 12:56:14.490763 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerStarted","Data":"d3216b1413fbc69a28db808476c1283800858f4bf48f04df660123a86997793c"}
Oct 02 12:56:16 crc kubenswrapper[4929]: I1002 12:56:16.512934 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerStarted","Data":"186e6f3472429ab8cb0c885268b96ce50cd132cc1c27d1e88b3452cb2cbfc1bf"}
Oct 02 12:56:16 crc kubenswrapper[4929]: I1002 12:56:16.513429 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 02 12:56:16 crc kubenswrapper[4929]: I1002 12:56:16.530933 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.014023831 podStartE2EDuration="6.530911805s" podCreationTimestamp="2025-10-02 12:56:10 +0000 UTC" firstStartedPulling="2025-10-02 12:56:11.825156814 +0000 UTC m=+6372.375523178" lastFinishedPulling="2025-10-02 12:56:15.342044788 +0000 UTC m=+6375.892411152" observedRunningTime="2025-10-02 12:56:16.530804682 +0000 UTC m=+6377.081171046" watchObservedRunningTime="2025-10-02 12:56:16.530911805 +0000 UTC m=+6377.081278179"
Oct 02 12:56:17 crc kubenswrapper[4929]: I1002 12:56:17.030124 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-bgxnp"]
Oct 02 12:56:17 crc kubenswrapper[4929]: I1002 12:56:17.040305 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-bgxnp"]
Oct 02 12:56:18 crc kubenswrapper[4929]: I1002 12:56:18.173913 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62690ece-7ef3-4ada-9ff4-9ed1d858fea6" path="/var/lib/kubelet/pods/62690ece-7ef3-4ada-9ff4-9ed1d858fea6/volumes"
Oct 02 12:56:19 crc kubenswrapper[4929]: I1002 12:56:19.874819 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-4brnk"]
Oct 02 12:56:19 crc kubenswrapper[4929]: I1002 12:56:19.876588 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4brnk"
Oct 02 12:56:19 crc kubenswrapper[4929]: I1002 12:56:19.886053 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-4brnk"]
Oct 02 12:56:19 crc kubenswrapper[4929]: I1002 12:56:19.986754 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcs8m\" (UniqueName: \"kubernetes.io/projected/c39c7e50-59e1-4392-a742-3c11054e25d8-kube-api-access-fcs8m\") pod \"aodh-db-create-4brnk\" (UID: \"c39c7e50-59e1-4392-a742-3c11054e25d8\") " pod="openstack/aodh-db-create-4brnk"
Oct 02 12:56:20 crc kubenswrapper[4929]: I1002 12:56:20.089358 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcs8m\" (UniqueName: \"kubernetes.io/projected/c39c7e50-59e1-4392-a742-3c11054e25d8-kube-api-access-fcs8m\") pod \"aodh-db-create-4brnk\" (UID: \"c39c7e50-59e1-4392-a742-3c11054e25d8\") " pod="openstack/aodh-db-create-4brnk"
Oct 02 12:56:20 crc kubenswrapper[4929]: I1002 12:56:20.109091 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcs8m\" (UniqueName: \"kubernetes.io/projected/c39c7e50-59e1-4392-a742-3c11054e25d8-kube-api-access-fcs8m\") pod \"aodh-db-create-4brnk\" (UID: \"c39c7e50-59e1-4392-a742-3c11054e25d8\") " pod="openstack/aodh-db-create-4brnk"
Oct 02 12:56:20 crc kubenswrapper[4929]: I1002 12:56:20.193647 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4brnk"
Oct 02 12:56:20 crc kubenswrapper[4929]: I1002 12:56:20.774240 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-4brnk"]
Oct 02 12:56:21 crc kubenswrapper[4929]: I1002 12:56:21.563107 4929 generic.go:334] "Generic (PLEG): container finished" podID="c39c7e50-59e1-4392-a742-3c11054e25d8" containerID="0bede43362ffd71d0457181b201c19d8c79fba4f503067bfb97cc75be4c5fbce" exitCode=0
Oct 02 12:56:21 crc kubenswrapper[4929]: I1002 12:56:21.563270 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4brnk" event={"ID":"c39c7e50-59e1-4392-a742-3c11054e25d8","Type":"ContainerDied","Data":"0bede43362ffd71d0457181b201c19d8c79fba4f503067bfb97cc75be4c5fbce"}
Oct 02 12:56:21 crc kubenswrapper[4929]: I1002 12:56:21.563492 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4brnk" event={"ID":"c39c7e50-59e1-4392-a742-3c11054e25d8","Type":"ContainerStarted","Data":"ed5d3030a323b3b471757450e236e6caff30f619215b698baed57e634df8ec13"}
Oct 02 12:56:23 crc kubenswrapper[4929]: I1002 12:56:23.112589 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4brnk"
Oct 02 12:56:23 crc kubenswrapper[4929]: I1002 12:56:23.161268 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcs8m\" (UniqueName: \"kubernetes.io/projected/c39c7e50-59e1-4392-a742-3c11054e25d8-kube-api-access-fcs8m\") pod \"c39c7e50-59e1-4392-a742-3c11054e25d8\" (UID: \"c39c7e50-59e1-4392-a742-3c11054e25d8\") "
Oct 02 12:56:23 crc kubenswrapper[4929]: I1002 12:56:23.167344 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39c7e50-59e1-4392-a742-3c11054e25d8-kube-api-access-fcs8m" (OuterVolumeSpecName: "kube-api-access-fcs8m") pod "c39c7e50-59e1-4392-a742-3c11054e25d8" (UID: "c39c7e50-59e1-4392-a742-3c11054e25d8"). InnerVolumeSpecName "kube-api-access-fcs8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:56:23 crc kubenswrapper[4929]: I1002 12:56:23.263709 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcs8m\" (UniqueName: \"kubernetes.io/projected/c39c7e50-59e1-4392-a742-3c11054e25d8-kube-api-access-fcs8m\") on node \"crc\" DevicePath \"\""
Oct 02 12:56:23 crc kubenswrapper[4929]: I1002 12:56:23.583818 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4brnk" event={"ID":"c39c7e50-59e1-4392-a742-3c11054e25d8","Type":"ContainerDied","Data":"ed5d3030a323b3b471757450e236e6caff30f619215b698baed57e634df8ec13"}
Oct 02 12:56:23 crc kubenswrapper[4929]: I1002 12:56:23.584164 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed5d3030a323b3b471757450e236e6caff30f619215b698baed57e634df8ec13"
Oct 02 12:56:23 crc kubenswrapper[4929]: I1002 12:56:23.584216 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4brnk"
Oct 02 12:56:26 crc kubenswrapper[4929]: I1002 12:56:26.236442 4929 scope.go:117] "RemoveContainer" containerID="f8b188f9fcc84dee299361d4f076bc6e00809e83f10a83f783610cca84f2b0c6"
Oct 02 12:56:26 crc kubenswrapper[4929]: I1002 12:56:26.262678 4929 scope.go:117] "RemoveContainer" containerID="f4d119c55679d5eb748d901a878c0c350f47bed375a2a3f5be759186bfdcbf15"
Oct 02 12:56:26 crc kubenswrapper[4929]: I1002 12:56:26.313247 4929 scope.go:117] "RemoveContainer" containerID="6f907189e14b13cde3faa31d2b31b248f1ebf717fbd175e8f6f3ef51a539c489"
Oct 02 12:56:26 crc kubenswrapper[4929]: I1002 12:56:26.354679 4929 scope.go:117] "RemoveContainer" containerID="96d9f03215cf48e90bc7d7889bea03f8a11e849239561d32e5d59b468bf6b35e"
Oct 02 12:56:26 crc kubenswrapper[4929]: I1002 12:56:26.406936 4929 scope.go:117] "RemoveContainer" containerID="dd1eef42e5188ebdcf01a6cce1eaaf99a94695c98863bc7c5bcff1ea5492bbc3"
Oct 02 12:56:29 crc kubenswrapper[4929]: I1002 12:56:29.901393 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-7a7b-account-create-r4q7k"]
Oct 02 12:56:29 crc kubenswrapper[4929]: E1002 12:56:29.902288 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39c7e50-59e1-4392-a742-3c11054e25d8" containerName="mariadb-database-create"
Oct 02 12:56:29 crc kubenswrapper[4929]: I1002 12:56:29.902301 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39c7e50-59e1-4392-a742-3c11054e25d8" containerName="mariadb-database-create"
Oct 02 12:56:29 crc kubenswrapper[4929]: I1002 12:56:29.902532 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39c7e50-59e1-4392-a742-3c11054e25d8" containerName="mariadb-database-create"
Oct 02 12:56:29 crc kubenswrapper[4929]: I1002 12:56:29.903258 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7a7b-account-create-r4q7k"
Oct 02 12:56:29 crc kubenswrapper[4929]: I1002 12:56:29.905388 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Oct 02 12:56:29 crc kubenswrapper[4929]: I1002 12:56:29.922521 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-7a7b-account-create-r4q7k"]
Oct 02 12:56:30 crc kubenswrapper[4929]: I1002 12:56:30.035836 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gbsz\" (UniqueName: \"kubernetes.io/projected/bb557e3e-048d-4b3e-bcf9-aa827b11e02d-kube-api-access-9gbsz\") pod \"aodh-7a7b-account-create-r4q7k\" (UID: \"bb557e3e-048d-4b3e-bcf9-aa827b11e02d\") " pod="openstack/aodh-7a7b-account-create-r4q7k"
Oct 02 12:56:30 crc kubenswrapper[4929]: I1002 12:56:30.137346 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbsz\" (UniqueName: \"kubernetes.io/projected/bb557e3e-048d-4b3e-bcf9-aa827b11e02d-kube-api-access-9gbsz\") pod \"aodh-7a7b-account-create-r4q7k\" (UID: \"bb557e3e-048d-4b3e-bcf9-aa827b11e02d\") " pod="openstack/aodh-7a7b-account-create-r4q7k"
Oct 02 12:56:30 crc kubenswrapper[4929]: I1002 12:56:30.154928 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gbsz\" (UniqueName: \"kubernetes.io/projected/bb557e3e-048d-4b3e-bcf9-aa827b11e02d-kube-api-access-9gbsz\") pod \"aodh-7a7b-account-create-r4q7k\" (UID: \"bb557e3e-048d-4b3e-bcf9-aa827b11e02d\") " pod="openstack/aodh-7a7b-account-create-r4q7k"
Oct 02 12:56:30 crc kubenswrapper[4929]: I1002 12:56:30.258437 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7a7b-account-create-r4q7k"
Oct 02 12:56:30 crc kubenswrapper[4929]: I1002 12:56:30.856304 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-7a7b-account-create-r4q7k"]
Oct 02 12:56:31 crc kubenswrapper[4929]: I1002 12:56:31.655832 4929 generic.go:334] "Generic (PLEG): container finished" podID="bb557e3e-048d-4b3e-bcf9-aa827b11e02d" containerID="a389bdc5843f4c36061cae2c2fa34c7feaae23cb830cc360f5465e2d6a4fa66f" exitCode=0
Oct 02 12:56:31 crc kubenswrapper[4929]: I1002 12:56:31.655879 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7a7b-account-create-r4q7k" event={"ID":"bb557e3e-048d-4b3e-bcf9-aa827b11e02d","Type":"ContainerDied","Data":"a389bdc5843f4c36061cae2c2fa34c7feaae23cb830cc360f5465e2d6a4fa66f"}
Oct 02 12:56:31 crc kubenswrapper[4929]: I1002 12:56:31.655907 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7a7b-account-create-r4q7k" event={"ID":"bb557e3e-048d-4b3e-bcf9-aa827b11e02d","Type":"ContainerStarted","Data":"5a67b91021bfee94c2cb9628cb34edbc75ce77b1f986f59ecca5e25cc61a51f2"}
Oct 02 12:56:33 crc kubenswrapper[4929]: I1002 12:56:33.078235 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7a7b-account-create-r4q7k"
Oct 02 12:56:33 crc kubenswrapper[4929]: I1002 12:56:33.091306 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gbsz\" (UniqueName: \"kubernetes.io/projected/bb557e3e-048d-4b3e-bcf9-aa827b11e02d-kube-api-access-9gbsz\") pod \"bb557e3e-048d-4b3e-bcf9-aa827b11e02d\" (UID: \"bb557e3e-048d-4b3e-bcf9-aa827b11e02d\") "
Oct 02 12:56:33 crc kubenswrapper[4929]: I1002 12:56:33.097273 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb557e3e-048d-4b3e-bcf9-aa827b11e02d-kube-api-access-9gbsz" (OuterVolumeSpecName: "kube-api-access-9gbsz") pod "bb557e3e-048d-4b3e-bcf9-aa827b11e02d" (UID: "bb557e3e-048d-4b3e-bcf9-aa827b11e02d"). InnerVolumeSpecName "kube-api-access-9gbsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:56:33 crc kubenswrapper[4929]: I1002 12:56:33.195008 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gbsz\" (UniqueName: \"kubernetes.io/projected/bb557e3e-048d-4b3e-bcf9-aa827b11e02d-kube-api-access-9gbsz\") on node \"crc\" DevicePath \"\""
Oct 02 12:56:33 crc kubenswrapper[4929]: I1002 12:56:33.671903 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-7a7b-account-create-r4q7k" event={"ID":"bb557e3e-048d-4b3e-bcf9-aa827b11e02d","Type":"ContainerDied","Data":"5a67b91021bfee94c2cb9628cb34edbc75ce77b1f986f59ecca5e25cc61a51f2"}
Oct 02 12:56:33 crc kubenswrapper[4929]: I1002 12:56:33.672181 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a67b91021bfee94c2cb9628cb34edbc75ce77b1f986f59ecca5e25cc61a51f2"
Oct 02 12:56:33 crc kubenswrapper[4929]: I1002 12:56:33.671948 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-7a7b-account-create-r4q7k"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.439941 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-v26ms"]
Oct 02 12:56:35 crc kubenswrapper[4929]: E1002 12:56:35.440750 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb557e3e-048d-4b3e-bcf9-aa827b11e02d" containerName="mariadb-account-create"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.440763 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb557e3e-048d-4b3e-bcf9-aa827b11e02d" containerName="mariadb-account-create"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.440972 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb557e3e-048d-4b3e-bcf9-aa827b11e02d" containerName="mariadb-account-create"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.441690 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.444601 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-w7gg8"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.445454 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.446446 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.460481 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-v26ms"]
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.539556 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-combined-ca-bundle\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.539610 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhbz\" (UniqueName: \"kubernetes.io/projected/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-kube-api-access-6mhbz\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.539655 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-scripts\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.539759 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-config-data\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.641219 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-combined-ca-bundle\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.641581 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mhbz\" (UniqueName: \"kubernetes.io/projected/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-kube-api-access-6mhbz\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.641617 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-scripts\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms"
Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.641671 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-config-data\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.647594 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-config-data\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.660210 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-combined-ca-bundle\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.660404 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-scripts\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.660602 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mhbz\" (UniqueName: \"kubernetes.io/projected/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-kube-api-access-6mhbz\") pod \"aodh-db-sync-v26ms\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:35 crc kubenswrapper[4929]: I1002 12:56:35.763764 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:36 crc kubenswrapper[4929]: I1002 12:56:36.555174 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-v26ms"] Oct 02 12:56:36 crc kubenswrapper[4929]: I1002 12:56:36.704233 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-v26ms" event={"ID":"95e0d329-c0ec-4e95-a696-ae33e2eda9dc","Type":"ContainerStarted","Data":"72460862234d5b29d16d6a863e188350b2626c089778905df94da2228e371101"} Oct 02 12:56:41 crc kubenswrapper[4929]: I1002 12:56:41.334015 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 12:56:42 crc kubenswrapper[4929]: I1002 12:56:42.767473 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-v26ms" event={"ID":"95e0d329-c0ec-4e95-a696-ae33e2eda9dc","Type":"ContainerStarted","Data":"f81535a71e3507564f7bdf50047ede6bd67286802813789207cff50b9155b61c"} Oct 02 12:56:42 crc kubenswrapper[4929]: I1002 12:56:42.781398 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-v26ms" podStartSLOduration=2.8657385829999997 podStartE2EDuration="7.781380627s" podCreationTimestamp="2025-10-02 12:56:35 +0000 UTC" firstStartedPulling="2025-10-02 12:56:36.57967835 +0000 UTC m=+6397.130044714" lastFinishedPulling="2025-10-02 12:56:41.495320394 +0000 UTC m=+6402.045686758" observedRunningTime="2025-10-02 12:56:42.780220573 +0000 UTC m=+6403.330586937" watchObservedRunningTime="2025-10-02 12:56:42.781380627 +0000 UTC m=+6403.331746991" Oct 02 12:56:44 crc kubenswrapper[4929]: I1002 12:56:44.788884 4929 generic.go:334] "Generic (PLEG): container finished" podID="95e0d329-c0ec-4e95-a696-ae33e2eda9dc" 
containerID="f81535a71e3507564f7bdf50047ede6bd67286802813789207cff50b9155b61c" exitCode=0 Oct 02 12:56:44 crc kubenswrapper[4929]: I1002 12:56:44.788981 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-v26ms" event={"ID":"95e0d329-c0ec-4e95-a696-ae33e2eda9dc","Type":"ContainerDied","Data":"f81535a71e3507564f7bdf50047ede6bd67286802813789207cff50b9155b61c"} Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.218773 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.376100 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mhbz\" (UniqueName: \"kubernetes.io/projected/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-kube-api-access-6mhbz\") pod \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.376293 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-scripts\") pod \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.376461 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-combined-ca-bundle\") pod \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.376531 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-config-data\") pod \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\" (UID: \"95e0d329-c0ec-4e95-a696-ae33e2eda9dc\") " Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.382414 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-kube-api-access-6mhbz" (OuterVolumeSpecName: "kube-api-access-6mhbz") pod "95e0d329-c0ec-4e95-a696-ae33e2eda9dc" (UID: "95e0d329-c0ec-4e95-a696-ae33e2eda9dc"). InnerVolumeSpecName "kube-api-access-6mhbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.382850 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-scripts" (OuterVolumeSpecName: "scripts") pod "95e0d329-c0ec-4e95-a696-ae33e2eda9dc" (UID: "95e0d329-c0ec-4e95-a696-ae33e2eda9dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.406525 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95e0d329-c0ec-4e95-a696-ae33e2eda9dc" (UID: "95e0d329-c0ec-4e95-a696-ae33e2eda9dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.416694 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-config-data" (OuterVolumeSpecName: "config-data") pod "95e0d329-c0ec-4e95-a696-ae33e2eda9dc" (UID: "95e0d329-c0ec-4e95-a696-ae33e2eda9dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.478695 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.478736 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.478748 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mhbz\" (UniqueName: \"kubernetes.io/projected/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-kube-api-access-6mhbz\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.478766 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95e0d329-c0ec-4e95-a696-ae33e2eda9dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.809547 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-v26ms" event={"ID":"95e0d329-c0ec-4e95-a696-ae33e2eda9dc","Type":"ContainerDied","Data":"72460862234d5b29d16d6a863e188350b2626c089778905df94da2228e371101"} Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.809601 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72460862234d5b29d16d6a863e188350b2626c089778905df94da2228e371101" Oct 02 12:56:46 crc kubenswrapper[4929]: I1002 12:56:46.809650 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-v26ms" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.546373 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 02 12:56:50 crc kubenswrapper[4929]: E1002 12:56:50.547286 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e0d329-c0ec-4e95-a696-ae33e2eda9dc" containerName="aodh-db-sync" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.547304 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e0d329-c0ec-4e95-a696-ae33e2eda9dc" containerName="aodh-db-sync" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.547563 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e0d329-c0ec-4e95-a696-ae33e2eda9dc" containerName="aodh-db-sync" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.549715 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.553803 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-w7gg8" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.553989 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.554089 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.562919 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.680860 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-combined-ca-bundle\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.681069 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-scripts\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.681114 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qzc\" (UniqueName: \"kubernetes.io/projected/efd3c5dd-604f-436d-8c94-abec6c600e77-kube-api-access-98qzc\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.681163 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-config-data\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.782746 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-scripts\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.782826 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98qzc\" (UniqueName: \"kubernetes.io/projected/efd3c5dd-604f-436d-8c94-abec6c600e77-kube-api-access-98qzc\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.782883 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-config-data\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.782913 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-combined-ca-bundle\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: 
I1002 12:56:50.789429 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-combined-ca-bundle\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.811649 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-config-data\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.811864 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd3c5dd-604f-436d-8c94-abec6c600e77-scripts\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.815505 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qzc\" (UniqueName: \"kubernetes.io/projected/efd3c5dd-604f-436d-8c94-abec6c600e77-kube-api-access-98qzc\") pod \"aodh-0\" (UID: \"efd3c5dd-604f-436d-8c94-abec6c600e77\") " pod="openstack/aodh-0" Oct 02 12:56:50 crc kubenswrapper[4929]: I1002 12:56:50.894361 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 02 12:56:51 crc kubenswrapper[4929]: I1002 12:56:51.462704 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 12:56:51 crc kubenswrapper[4929]: I1002 12:56:51.868120 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"efd3c5dd-604f-436d-8c94-abec6c600e77","Type":"ContainerStarted","Data":"d9d5cc825fbd272042356eba2dffdc24df57b74bf44de43c98bafb2ebf3ea947"} Oct 02 12:56:52 crc kubenswrapper[4929]: I1002 12:56:52.880278 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"efd3c5dd-604f-436d-8c94-abec6c600e77","Type":"ContainerStarted","Data":"d1aaa88c16d74e7a3cfe78342b7dfd68a4a0775d6d052fb803cce460265cb8cc"} Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.035910 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.036599 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-central-agent" containerID="cri-o://62d3cba6685771dd692d0b9d744f60c2fc802aaf61d0f715475506e100ca3de9" gracePeriod=30 Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.037130 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="proxy-httpd" containerID="cri-o://186e6f3472429ab8cb0c885268b96ce50cd132cc1c27d1e88b3452cb2cbfc1bf" gracePeriod=30 Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.037189 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="sg-core" containerID="cri-o://d3216b1413fbc69a28db808476c1283800858f4bf48f04df660123a86997793c" gracePeriod=30 Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.037235 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-notification-agent" containerID="cri-o://5bc991da5ea2d943c40749f81f60af209eefa027a4a6e0f3ecc1c5a71fa4aa15" gracePeriod=30 Oct 02 12:56:53 crc kubenswrapper[4929]: E1002 12:56:53.293172 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ec048f_ac83_4e45_814b_97a5bebb8c41.slice/crio-d3216b1413fbc69a28db808476c1283800858f4bf48f04df660123a86997793c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ec048f_ac83_4e45_814b_97a5bebb8c41.slice/crio-186e6f3472429ab8cb0c885268b96ce50cd132cc1c27d1e88b3452cb2cbfc1bf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ec048f_ac83_4e45_814b_97a5bebb8c41.slice/crio-conmon-186e6f3472429ab8cb0c885268b96ce50cd132cc1c27d1e88b3452cb2cbfc1bf.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.895157 4929 generic.go:334] "Generic (PLEG): container finished" podID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerID="186e6f3472429ab8cb0c885268b96ce50cd132cc1c27d1e88b3452cb2cbfc1bf" exitCode=0 Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.897134 4929 generic.go:334] "Generic (PLEG): container finished" podID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerID="d3216b1413fbc69a28db808476c1283800858f4bf48f04df660123a86997793c" exitCode=2 Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.895513 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerDied","Data":"186e6f3472429ab8cb0c885268b96ce50cd132cc1c27d1e88b3452cb2cbfc1bf"} Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.897219 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerDied","Data":"d3216b1413fbc69a28db808476c1283800858f4bf48f04df660123a86997793c"} Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.897236 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerDied","Data":"62d3cba6685771dd692d0b9d744f60c2fc802aaf61d0f715475506e100ca3de9"} Oct 02 12:56:53 crc kubenswrapper[4929]: I1002 12:56:53.897184 4929 generic.go:334] "Generic (PLEG): container finished" podID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerID="62d3cba6685771dd692d0b9d744f60c2fc802aaf61d0f715475506e100ca3de9" exitCode=0 Oct 02 12:56:54 crc kubenswrapper[4929]: I1002 12:56:54.917392 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"efd3c5dd-604f-436d-8c94-abec6c600e77","Type":"ContainerStarted","Data":"4c2cba6745e8e98ff84d794f275f3ead835004ecaa3dc554fed0d373453d416c"} Oct 02 12:56:55 crc kubenswrapper[4929]: I1002 12:56:55.898251 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:56:55 crc kubenswrapper[4929]: I1002 12:56:55.930212 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"efd3c5dd-604f-436d-8c94-abec6c600e77","Type":"ContainerStarted","Data":"93747014cb58a4f1b099d7475e84a94a8d8d7f53b21cf26b891dbfc135ae1338"} Oct 02 12:56:56 crc kubenswrapper[4929]: I1002 
Oct 02 12:56:56 crc kubenswrapper[4929]: I1002 12:56:56.941125 4929 generic.go:334] "Generic (PLEG): container finished" podID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerID="5bc991da5ea2d943c40749f81f60af209eefa027a4a6e0f3ecc1c5a71fa4aa15" exitCode=0 Oct 02 12:56:56 crc kubenswrapper[4929]: I1002 12:56:56.941192 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerDied","Data":"5bc991da5ea2d943c40749f81f60af209eefa027a4a6e0f3ecc1c5a71fa4aa15"} Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.179851 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.334251 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-config-data\") pod \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.334470 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-sg-core-conf-yaml\") pod \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.334520 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-log-httpd\") pod \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.334587 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-combined-ca-bundle\") pod \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.334652 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-run-httpd\") pod \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.334685 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnbzd\" (UniqueName: \"kubernetes.io/projected/f7ec048f-ac83-4e45-814b-97a5bebb8c41-kube-api-access-hnbzd\") pod \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.334715 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-scripts\") pod \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\" (UID: \"f7ec048f-ac83-4e45-814b-97a5bebb8c41\") " Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.335438 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7ec048f-ac83-4e45-814b-97a5bebb8c41" (UID: "f7ec048f-ac83-4e45-814b-97a5bebb8c41"). InnerVolumeSpecName "run-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.335455 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7ec048f-ac83-4e45-814b-97a5bebb8c41" (UID: "f7ec048f-ac83-4e45-814b-97a5bebb8c41"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.340240 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-scripts" (OuterVolumeSpecName: "scripts") pod "f7ec048f-ac83-4e45-814b-97a5bebb8c41" (UID: "f7ec048f-ac83-4e45-814b-97a5bebb8c41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.340338 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ec048f-ac83-4e45-814b-97a5bebb8c41-kube-api-access-hnbzd" (OuterVolumeSpecName: "kube-api-access-hnbzd") pod "f7ec048f-ac83-4e45-814b-97a5bebb8c41" (UID: "f7ec048f-ac83-4e45-814b-97a5bebb8c41"). InnerVolumeSpecName "kube-api-access-hnbzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.364272 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7ec048f-ac83-4e45-814b-97a5bebb8c41" (UID: "f7ec048f-ac83-4e45-814b-97a5bebb8c41"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.414235 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7ec048f-ac83-4e45-814b-97a5bebb8c41" (UID: "f7ec048f-ac83-4e45-814b-97a5bebb8c41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.431892 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-config-data" (OuterVolumeSpecName: "config-data") pod "f7ec048f-ac83-4e45-814b-97a5bebb8c41" (UID: "f7ec048f-ac83-4e45-814b-97a5bebb8c41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.437325 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.437356 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.437368 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.437376 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7ec048f-ac83-4e45-814b-97a5bebb8c41-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.437385 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnbzd\" (UniqueName: \"kubernetes.io/projected/f7ec048f-ac83-4e45-814b-97a5bebb8c41-kube-api-access-hnbzd\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.437395 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.437403 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7ec048f-ac83-4e45-814b-97a5bebb8c41-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.977940 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"efd3c5dd-604f-436d-8c94-abec6c600e77","Type":"ContainerStarted","Data":"fb713e9f1c56d2a6cde9cadcd63f0af3cae3de16a953b21857f463ad5dc6c1ea"} Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.982816 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7ec048f-ac83-4e45-814b-97a5bebb8c41","Type":"ContainerDied","Data":"d484f489f6068396fa9c0649dd47ffb7cf151ea50f9d455247f8a56ad1030d7f"} Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.982926 4929 scope.go:117] "RemoveContainer" containerID="186e6f3472429ab8cb0c885268b96ce50cd132cc1c27d1e88b3452cb2cbfc1bf" Oct 02 12:56:57 crc kubenswrapper[4929]: I1002 12:56:57.982945 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.016084 4929 scope.go:117] "RemoveContainer" containerID="d3216b1413fbc69a28db808476c1283800858f4bf48f04df660123a86997793c" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.016730 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.666035035 podStartE2EDuration="8.016713582s" podCreationTimestamp="2025-10-02 12:56:50 +0000 UTC" firstStartedPulling="2025-10-02 12:56:51.472873704 +0000 UTC m=+6412.023240068" lastFinishedPulling="2025-10-02 12:56:56.823552251 +0000 UTC m=+6417.373918615" observedRunningTime="2025-10-02 12:56:57.997043603 +0000 UTC m=+6418.547409967" watchObservedRunningTime="2025-10-02 12:56:58.016713582 +0000 UTC m=+6418.567079946" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.042097 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.055006 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.070887 4929 scope.go:117] "RemoveContainer" containerID="5bc991da5ea2d943c40749f81f60af209eefa027a4a6e0f3ecc1c5a71fa4aa15" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.070997 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:56:58 crc kubenswrapper[4929]: E1002 12:56:58.071480 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-notification-agent" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071506 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-notification-agent" Oct 02 12:56:58 crc kubenswrapper[4929]: E1002 12:56:58.071545 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-central-agent" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071551 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-central-agent" Oct 02 12:56:58 crc kubenswrapper[4929]: E1002 12:56:58.071574 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="proxy-httpd" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071581 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="proxy-httpd" Oct 02 12:56:58 crc kubenswrapper[4929]: E1002 12:56:58.071592 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="sg-core" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071598 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="sg-core" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071780 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-notification-agent" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071791 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="proxy-httpd" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071816 4929 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="ceilometer-central-agent" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.071825 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" containerName="sg-core" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.074241 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.077176 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.081901 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.082031 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.118402 4929 scope.go:117] "RemoveContainer" containerID="62d3cba6685771dd692d0b9d744f60c2fc802aaf61d0f715475506e100ca3de9" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.168495 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ec048f-ac83-4e45-814b-97a5bebb8c41" path="/var/lib/kubelet/pods/f7ec048f-ac83-4e45-814b-97a5bebb8c41/volumes" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.170306 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-run-httpd\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.170460 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-log-httpd\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.170488 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.170667 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.170795 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqc8c\" (UniqueName: \"kubernetes.io/projected/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-kube-api-access-lqc8c\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.170867 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-scripts\") 
pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.170925 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-config-data\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.273564 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-run-httpd\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.273834 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-log-httpd\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.273863 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.273935 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.274040 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-run-httpd\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.274071 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqc8c\" (UniqueName: \"kubernetes.io/projected/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-kube-api-access-lqc8c\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.274105 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-scripts\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.274154 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-config-data\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.274548 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-log-httpd\") pod \"ceilometer-0\" (UID: 
\"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.278180 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-scripts\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.278207 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-config-data\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.278180 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.291247 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.294233 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqc8c\" (UniqueName: \"kubernetes.io/projected/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-kube-api-access-lqc8c\") pod \"ceilometer-0\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") " pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.413984 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.913439 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:56:58 crc kubenswrapper[4929]: I1002 12:56:58.994802 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerStarted","Data":"c0e2e2f229094afc8639093623397f427a87a11adc22f4a68978859096908a21"} Oct 02 12:57:00 crc kubenswrapper[4929]: I1002 12:57:00.010293 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerStarted","Data":"01d5667a345625e24286c28da9802fb2529d942c7cfc074954b447d263fd01f7"} Oct 02 12:57:01 crc kubenswrapper[4929]: I1002 12:57:01.046337 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerStarted","Data":"10b439a0a645cb081558cef9cfb8bb121961b27f2f4d130bca081757b9486554"} Oct 02 12:57:03 crc kubenswrapper[4929]: I1002 12:57:03.070262 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerStarted","Data":"193d592b7ab16a68b9bf905ffb4243f6382e8cb51b86b708f2bb865d0b35c37c"} Oct 02 12:57:04 crc kubenswrapper[4929]: I1002 12:57:04.083311 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerStarted","Data":"8b68d8848a2aa9508375ecd4adc4e924dc61dccd387be46f5bef0307dae46e8b"} Oct 02 12:57:04 crc kubenswrapper[4929]: I1002 12:57:04.083839 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 12:57:04 crc kubenswrapper[4929]: I1002 12:57:04.110417 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.445528157 podStartE2EDuration="6.110390137s" podCreationTimestamp="2025-10-02 12:56:58 +0000 UTC" firstStartedPulling="2025-10-02 12:56:58.92172456 +0000 UTC m=+6419.472090924" lastFinishedPulling="2025-10-02 12:57:03.58658654 +0000 UTC m=+6424.136952904" observedRunningTime="2025-10-02 12:57:04.103344384 +0000 UTC m=+6424.653710758" watchObservedRunningTime="2025-10-02 12:57:04.110390137 +0000 UTC m=+6424.660756521" Oct 02 12:57:05 crc kubenswrapper[4929]: I1002 12:57:05.292525 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-frhnh"] Oct 02 12:57:05 crc kubenswrapper[4929]: I1002 12:57:05.294575 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-frhnh" Oct 02 12:57:05 crc kubenswrapper[4929]: I1002 12:57:05.301711 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-frhnh"] Oct 02 12:57:05 crc kubenswrapper[4929]: I1002 12:57:05.432682 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v84gz\" (UniqueName: \"kubernetes.io/projected/3fd09510-c570-4fdc-afe4-206212f65c8e-kube-api-access-v84gz\") pod \"manila-db-create-frhnh\" (UID: \"3fd09510-c570-4fdc-afe4-206212f65c8e\") " pod="openstack/manila-db-create-frhnh" Oct 02 12:57:05 crc kubenswrapper[4929]: I1002 12:57:05.534795 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v84gz\" (UniqueName: \"kubernetes.io/projected/3fd09510-c570-4fdc-afe4-206212f65c8e-kube-api-access-v84gz\") pod \"manila-db-create-frhnh\" (UID: \"3fd09510-c570-4fdc-afe4-206212f65c8e\") " pod="openstack/manila-db-create-frhnh" Oct 02 12:57:05 crc kubenswrapper[4929]: I1002 12:57:05.555229 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v84gz\" (UniqueName: \"kubernetes.io/projected/3fd09510-c570-4fdc-afe4-206212f65c8e-kube-api-access-v84gz\") pod \"manila-db-create-frhnh\" (UID: \"3fd09510-c570-4fdc-afe4-206212f65c8e\") " pod="openstack/manila-db-create-frhnh" Oct 02 12:57:05 crc kubenswrapper[4929]: I1002 12:57:05.620314 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-frhnh" Oct 02 12:57:06 crc kubenswrapper[4929]: I1002 12:57:06.225774 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-frhnh"] Oct 02 12:57:07 crc kubenswrapper[4929]: I1002 12:57:07.113791 4929 generic.go:334] "Generic (PLEG): container finished" podID="3fd09510-c570-4fdc-afe4-206212f65c8e" containerID="dd215e6d27b257cea282c8b65069f1a8e81d19255c6137fe2f20a101c7929878" exitCode=0 Oct 02 12:57:07 crc kubenswrapper[4929]: I1002 12:57:07.113871 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-frhnh" event={"ID":"3fd09510-c570-4fdc-afe4-206212f65c8e","Type":"ContainerDied","Data":"dd215e6d27b257cea282c8b65069f1a8e81d19255c6137fe2f20a101c7929878"} Oct 02 12:57:07 crc kubenswrapper[4929]: I1002 12:57:07.114426 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-frhnh" event={"ID":"3fd09510-c570-4fdc-afe4-206212f65c8e","Type":"ContainerStarted","Data":"24e6825021e7666c8338f2fea9aaf1a4beb2ca8bbd496171ea1d99d6fe96fe89"} Oct 02 12:57:08 crc kubenswrapper[4929]: I1002 12:57:08.598220 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-frhnh" Oct 02 12:57:08 crc kubenswrapper[4929]: I1002 12:57:08.701729 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v84gz\" (UniqueName: \"kubernetes.io/projected/3fd09510-c570-4fdc-afe4-206212f65c8e-kube-api-access-v84gz\") pod \"3fd09510-c570-4fdc-afe4-206212f65c8e\" (UID: \"3fd09510-c570-4fdc-afe4-206212f65c8e\") " Oct 02 12:57:08 crc kubenswrapper[4929]: I1002 12:57:08.707410 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd09510-c570-4fdc-afe4-206212f65c8e-kube-api-access-v84gz" (OuterVolumeSpecName: "kube-api-access-v84gz") pod "3fd09510-c570-4fdc-afe4-206212f65c8e" (UID: "3fd09510-c570-4fdc-afe4-206212f65c8e"). 
InnerVolumeSpecName "kube-api-access-v84gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:57:08 crc kubenswrapper[4929]: I1002 12:57:08.804495 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v84gz\" (UniqueName: \"kubernetes.io/projected/3fd09510-c570-4fdc-afe4-206212f65c8e-kube-api-access-v84gz\") on node \"crc\" DevicePath \"\"" Oct 02 12:57:09 crc kubenswrapper[4929]: I1002 12:57:09.136660 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-frhnh" event={"ID":"3fd09510-c570-4fdc-afe4-206212f65c8e","Type":"ContainerDied","Data":"24e6825021e7666c8338f2fea9aaf1a4beb2ca8bbd496171ea1d99d6fe96fe89"} Oct 02 12:57:09 crc kubenswrapper[4929]: I1002 12:57:09.136713 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24e6825021e7666c8338f2fea9aaf1a4beb2ca8bbd496171ea1d99d6fe96fe89" Oct 02 12:57:09 crc kubenswrapper[4929]: I1002 12:57:09.136726 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-frhnh" Oct 02 12:57:14 crc kubenswrapper[4929]: I1002 12:57:14.736728 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:57:14 crc kubenswrapper[4929]: I1002 12:57:14.737723 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.344651 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-5b16-account-create-c2fzx"] Oct 02 12:57:15 crc kubenswrapper[4929]: E1002 12:57:15.345139 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd09510-c570-4fdc-afe4-206212f65c8e" containerName="mariadb-database-create" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.345153 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd09510-c570-4fdc-afe4-206212f65c8e" containerName="mariadb-database-create" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.345352 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd09510-c570-4fdc-afe4-206212f65c8e" containerName="mariadb-database-create" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.346073 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-5b16-account-create-c2fzx" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.349099 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.354630 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-5b16-account-create-c2fzx"] Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.430767 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46bjp\" (UniqueName: \"kubernetes.io/projected/565fce37-3c88-4527-aa53-688c6e3ab5c9-kube-api-access-46bjp\") pod \"manila-5b16-account-create-c2fzx\" (UID: \"565fce37-3c88-4527-aa53-688c6e3ab5c9\") " pod="openstack/manila-5b16-account-create-c2fzx" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.534301 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46bjp\" (UniqueName: \"kubernetes.io/projected/565fce37-3c88-4527-aa53-688c6e3ab5c9-kube-api-access-46bjp\") pod \"manila-5b16-account-create-c2fzx\" (UID: \"565fce37-3c88-4527-aa53-688c6e3ab5c9\") " pod="openstack/manila-5b16-account-create-c2fzx" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.560257 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46bjp\" (UniqueName: \"kubernetes.io/projected/565fce37-3c88-4527-aa53-688c6e3ab5c9-kube-api-access-46bjp\") pod \"manila-5b16-account-create-c2fzx\" (UID: \"565fce37-3c88-4527-aa53-688c6e3ab5c9\") " pod="openstack/manila-5b16-account-create-c2fzx" Oct 02 12:57:15 crc kubenswrapper[4929]: I1002 12:57:15.665310 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-5b16-account-create-c2fzx" Oct 02 12:57:16 crc kubenswrapper[4929]: I1002 12:57:16.173081 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-5b16-account-create-c2fzx"] Oct 02 12:57:16 crc kubenswrapper[4929]: I1002 12:57:16.210157 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-5b16-account-create-c2fzx" event={"ID":"565fce37-3c88-4527-aa53-688c6e3ab5c9","Type":"ContainerStarted","Data":"f747ea0d1cd2bbb6506834cb6bdcd77849c4ff71a1f8af0402cb705b19508183"} Oct 02 12:57:17 crc kubenswrapper[4929]: I1002 12:57:17.220322 4929 generic.go:334] "Generic (PLEG): container finished" podID="565fce37-3c88-4527-aa53-688c6e3ab5c9" containerID="f55eed708757f98f617dff476d4efaf1488663c370ff34c4cfe7d999e425c065" exitCode=0 Oct 02 12:57:17 crc kubenswrapper[4929]: I1002 12:57:17.220404 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-5b16-account-create-c2fzx" event={"ID":"565fce37-3c88-4527-aa53-688c6e3ab5c9","Type":"ContainerDied","Data":"f55eed708757f98f617dff476d4efaf1488663c370ff34c4cfe7d999e425c065"} Oct 02 12:57:18 crc kubenswrapper[4929]: I1002 12:57:18.702427 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-5b16-account-create-c2fzx" Oct 02 12:57:18 crc kubenswrapper[4929]: I1002 12:57:18.808438 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46bjp\" (UniqueName: \"kubernetes.io/projected/565fce37-3c88-4527-aa53-688c6e3ab5c9-kube-api-access-46bjp\") pod \"565fce37-3c88-4527-aa53-688c6e3ab5c9\" (UID: \"565fce37-3c88-4527-aa53-688c6e3ab5c9\") " Oct 02 12:57:18 crc kubenswrapper[4929]: I1002 12:57:18.820248 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565fce37-3c88-4527-aa53-688c6e3ab5c9-kube-api-access-46bjp" (OuterVolumeSpecName: "kube-api-access-46bjp") pod "565fce37-3c88-4527-aa53-688c6e3ab5c9" (UID: "565fce37-3c88-4527-aa53-688c6e3ab5c9"). InnerVolumeSpecName "kube-api-access-46bjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:57:18 crc kubenswrapper[4929]: I1002 12:57:18.913379 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46bjp\" (UniqueName: \"kubernetes.io/projected/565fce37-3c88-4527-aa53-688c6e3ab5c9-kube-api-access-46bjp\") on node \"crc\" DevicePath \"\"" Oct 02 12:57:19 crc kubenswrapper[4929]: I1002 12:57:19.247527 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-5b16-account-create-c2fzx" event={"ID":"565fce37-3c88-4527-aa53-688c6e3ab5c9","Type":"ContainerDied","Data":"f747ea0d1cd2bbb6506834cb6bdcd77849c4ff71a1f8af0402cb705b19508183"} Oct 02 12:57:19 crc kubenswrapper[4929]: I1002 12:57:19.247569 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f747ea0d1cd2bbb6506834cb6bdcd77849c4ff71a1f8af0402cb705b19508183" Oct 02 12:57:19 crc kubenswrapper[4929]: I1002 12:57:19.247621 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-5b16-account-create-c2fzx" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.594913 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-q5nx8"] Oct 02 12:57:20 crc kubenswrapper[4929]: E1002 12:57:20.595395 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565fce37-3c88-4527-aa53-688c6e3ab5c9" containerName="mariadb-account-create" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.595409 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="565fce37-3c88-4527-aa53-688c6e3ab5c9" containerName="mariadb-account-create" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.595632 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="565fce37-3c88-4527-aa53-688c6e3ab5c9" containerName="mariadb-account-create" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.596416 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.598941 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-d7zpp" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.599895 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.606529 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-q5nx8"] Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.651047 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-job-config-data\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.651116 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-config-data\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.651219 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-combined-ca-bundle\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.651265 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnqzm\" (UniqueName: \"kubernetes.io/projected/e165862b-1cb5-4cfb-934f-33abd112b63f-kube-api-access-rnqzm\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.753293 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-job-config-data\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.753393 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-config-data\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.753517 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-combined-ca-bundle\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.753564 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnqzm\" (UniqueName: \"kubernetes.io/projected/e165862b-1cb5-4cfb-934f-33abd112b63f-kube-api-access-rnqzm\") pod \"manila-db-sync-q5nx8\" (UID: 
\"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.759120 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-job-config-data\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.759356 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-combined-ca-bundle\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.766793 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-config-data\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.771309 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnqzm\" (UniqueName: \"kubernetes.io/projected/e165862b-1cb5-4cfb-934f-33abd112b63f-kube-api-access-rnqzm\") pod \"manila-db-sync-q5nx8\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:20 crc kubenswrapper[4929]: I1002 12:57:20.931327 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:21 crc kubenswrapper[4929]: I1002 12:57:21.849659 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-q5nx8"] Oct 02 12:57:22 crc kubenswrapper[4929]: I1002 12:57:22.277461 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-q5nx8" event={"ID":"e165862b-1cb5-4cfb-934f-33abd112b63f","Type":"ContainerStarted","Data":"62202d746b0916bc353768de707314aed25c633c7319c1224ed58f5c19723c3d"} Oct 02 12:57:27 crc kubenswrapper[4929]: I1002 12:57:27.327672 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-q5nx8" event={"ID":"e165862b-1cb5-4cfb-934f-33abd112b63f","Type":"ContainerStarted","Data":"8492690d54bd882581ed4c6871094cfa65f767d4c84a2e2197a983dff379cafe"} Oct 02 12:57:27 crc kubenswrapper[4929]: I1002 12:57:27.351071 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-q5nx8" podStartSLOduration=3.322961196 podStartE2EDuration="7.351052114s" podCreationTimestamp="2025-10-02 12:57:20 +0000 UTC" firstStartedPulling="2025-10-02 12:57:21.853635758 +0000 UTC m=+6442.404002122" lastFinishedPulling="2025-10-02 12:57:25.881726676 +0000 UTC m=+6446.432093040" observedRunningTime="2025-10-02 12:57:27.34401117 +0000 UTC m=+6447.894377534" watchObservedRunningTime="2025-10-02 12:57:27.351052114 +0000 UTC m=+6447.901418478" Oct 02 12:57:28 crc kubenswrapper[4929]: I1002 12:57:28.339044 4929 generic.go:334] "Generic (PLEG): container finished" podID="e165862b-1cb5-4cfb-934f-33abd112b63f" containerID="8492690d54bd882581ed4c6871094cfa65f767d4c84a2e2197a983dff379cafe" exitCode=0 Oct 02 12:57:28 crc kubenswrapper[4929]: I1002 12:57:28.339115 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-q5nx8" 
event={"ID":"e165862b-1cb5-4cfb-934f-33abd112b63f","Type":"ContainerDied","Data":"8492690d54bd882581ed4c6871094cfa65f767d4c84a2e2197a983dff379cafe"} Oct 02 12:57:28 crc kubenswrapper[4929]: I1002 12:57:28.422576 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.860560 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.960090 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnqzm\" (UniqueName: \"kubernetes.io/projected/e165862b-1cb5-4cfb-934f-33abd112b63f-kube-api-access-rnqzm\") pod \"e165862b-1cb5-4cfb-934f-33abd112b63f\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.960182 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-config-data\") pod \"e165862b-1cb5-4cfb-934f-33abd112b63f\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.960224 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-combined-ca-bundle\") pod \"e165862b-1cb5-4cfb-934f-33abd112b63f\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.960283 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-job-config-data\") pod \"e165862b-1cb5-4cfb-934f-33abd112b63f\" (UID: \"e165862b-1cb5-4cfb-934f-33abd112b63f\") " Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.967232 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e165862b-1cb5-4cfb-934f-33abd112b63f-kube-api-access-rnqzm" (OuterVolumeSpecName: "kube-api-access-rnqzm") pod "e165862b-1cb5-4cfb-934f-33abd112b63f" (UID: "e165862b-1cb5-4cfb-934f-33abd112b63f"). InnerVolumeSpecName "kube-api-access-rnqzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.978761 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "e165862b-1cb5-4cfb-934f-33abd112b63f" (UID: "e165862b-1cb5-4cfb-934f-33abd112b63f"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:57:29 crc kubenswrapper[4929]: I1002 12:57:29.980835 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-config-data" (OuterVolumeSpecName: "config-data") pod "e165862b-1cb5-4cfb-934f-33abd112b63f" (UID: "e165862b-1cb5-4cfb-934f-33abd112b63f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.001559 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e165862b-1cb5-4cfb-934f-33abd112b63f" (UID: "e165862b-1cb5-4cfb-934f-33abd112b63f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.062243 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.062279 4929 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.062288 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnqzm\" (UniqueName: \"kubernetes.io/projected/e165862b-1cb5-4cfb-934f-33abd112b63f-kube-api-access-rnqzm\") on node \"crc\" DevicePath \"\"" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.062299 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e165862b-1cb5-4cfb-934f-33abd112b63f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.379145 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-q5nx8" event={"ID":"e165862b-1cb5-4cfb-934f-33abd112b63f","Type":"ContainerDied","Data":"62202d746b0916bc353768de707314aed25c633c7319c1224ed58f5c19723c3d"} Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.379885 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-q5nx8" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.380845 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62202d746b0916bc353768de707314aed25c633c7319c1224ed58f5c19723c3d" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.791224 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 12:57:30 crc kubenswrapper[4929]: E1002 12:57:30.791719 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e165862b-1cb5-4cfb-934f-33abd112b63f" containerName="manila-db-sync" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.791736 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e165862b-1cb5-4cfb-934f-33abd112b63f" containerName="manila-db-sync" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.791976 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e165862b-1cb5-4cfb-934f-33abd112b63f" containerName="manila-db-sync" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.793189 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.798454 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.798609 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.802789 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-d7zpp" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.802965 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.828062 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.841057 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.843522 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.850477 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.888132 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893352 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-scripts\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893423 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2215b331-4644-4892-93cc-a99728b27cad-ceph\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893448 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-config-data\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893484 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6rbr\" (UniqueName: \"kubernetes.io/projected/2215b331-4644-4892-93cc-a99728b27cad-kube-api-access-n6rbr\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893518 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/067603a9-c42e-433c-ad6b-1522338b1041-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893539 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893562 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2215b331-4644-4892-93cc-a99728b27cad-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893643 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2215b331-4644-4892-93cc-a99728b27cad-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893671 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893709 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-scripts\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893776 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893800 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-config-data\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893824 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hb2p\" (UniqueName: \"kubernetes.io/projected/067603a9-c42e-433c-ad6b-1522338b1041-kube-api-access-6hb2p\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.893867 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.901060 4929 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64fd8d7f5-h8q8n"] Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.903315 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.967729 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64fd8d7f5-h8q8n"] Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995202 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995253 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-config-data\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995296 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hb2p\" (UniqueName: \"kubernetes.io/projected/067603a9-c42e-433c-ad6b-1522338b1041-kube-api-access-6hb2p\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995340 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995364 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-scripts\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995392 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2215b331-4644-4892-93cc-a99728b27cad-ceph\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995411 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-config-data\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995438 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6rbr\" (UniqueName: \"kubernetes.io/projected/2215b331-4644-4892-93cc-a99728b27cad-kube-api-access-n6rbr\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995462 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995478 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/067603a9-c42e-433c-ad6b-1522338b1041-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995497 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2215b331-4644-4892-93cc-a99728b27cad-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995521 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-sb\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995545 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-config\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995581 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-dns-svc\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995604 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2215b331-4644-4892-93cc-a99728b27cad-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995622 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995647 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-nb\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995666 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-scripts\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995691 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzfhk\" (UniqueName: \"kubernetes.io/projected/ad696f41-0dc4-4f9a-b099-c0255ebbe450-kube-api-access-nzfhk\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.995961 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/067603a9-c42e-433c-ad6b-1522338b1041-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.996275 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/2215b331-4644-4892-93cc-a99728b27cad-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:30 crc kubenswrapper[4929]: I1002 12:57:30.999388 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2215b331-4644-4892-93cc-a99728b27cad-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.003760 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.005724 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-config-data\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.005907 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-config-data\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.006781 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-scripts\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.007343 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 
12:57:31.009671 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.014668 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/067603a9-c42e-433c-ad6b-1522338b1041-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.016586 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2215b331-4644-4892-93cc-a99728b27cad-ceph\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.019435 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hb2p\" (UniqueName: \"kubernetes.io/projected/067603a9-c42e-433c-ad6b-1522338b1041-kube-api-access-6hb2p\") pod \"manila-scheduler-0\" (UID: \"067603a9-c42e-433c-ad6b-1522338b1041\") " pod="openstack/manila-scheduler-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.030718 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2215b331-4644-4892-93cc-a99728b27cad-scripts\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.043235 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6rbr\" (UniqueName: \"kubernetes.io/projected/2215b331-4644-4892-93cc-a99728b27cad-kube-api-access-n6rbr\") pod \"manila-share-share1-0\" (UID: \"2215b331-4644-4892-93cc-a99728b27cad\") " pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.074155 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.077185 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.081870 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103575 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-dns-svc\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103632 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20702f9c-bac3-4da2-afc6-59ef9849429f-logs\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103668 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-nb\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103691 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-config-data-custom\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103722 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzfhk\" (UniqueName: \"kubernetes.io/projected/ad696f41-0dc4-4f9a-b099-c0255ebbe450-kube-api-access-nzfhk\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103753 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20702f9c-bac3-4da2-afc6-59ef9849429f-etc-machine-id\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103779 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103799 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhz7z\" (UniqueName: \"kubernetes.io/projected/20702f9c-bac3-4da2-afc6-59ef9849429f-kube-api-access-lhz7z\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103815 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-config-data\") pod \"manila-api-0\" 
(UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103836 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-scripts\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.103971 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-sb\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.104005 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-config\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.107884 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-nb\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.108518 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-sb\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.109183 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-config\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.110712 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-dns-svc\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.110856 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.121683 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.129090 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzfhk\" (UniqueName: \"kubernetes.io/projected/ad696f41-0dc4-4f9a-b099-c0255ebbe450-kube-api-access-nzfhk\") pod \"dnsmasq-dns-64fd8d7f5-h8q8n\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.189654 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.206466 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20702f9c-bac3-4da2-afc6-59ef9849429f-logs\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.206525 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-config-data-custom\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.206562 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20702f9c-bac3-4da2-afc6-59ef9849429f-etc-machine-id\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.206602 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.206623 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhz7z\" (UniqueName: \"kubernetes.io/projected/20702f9c-bac3-4da2-afc6-59ef9849429f-kube-api-access-lhz7z\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.206644 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-config-data\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.206664 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-scripts\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.207151 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20702f9c-bac3-4da2-afc6-59ef9849429f-etc-machine-id\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.209115 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20702f9c-bac3-4da2-afc6-59ef9849429f-logs\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.212515 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-config-data\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" 
Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.215333 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-config-data-custom\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.216430 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.216769 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20702f9c-bac3-4da2-afc6-59ef9849429f-scripts\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.235497 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhz7z\" (UniqueName: \"kubernetes.io/projected/20702f9c-bac3-4da2-afc6-59ef9849429f-kube-api-access-lhz7z\") pod \"manila-api-0\" (UID: \"20702f9c-bac3-4da2-afc6-59ef9849429f\") " pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.246315 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.281710 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.767489 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 12:57:31 crc kubenswrapper[4929]: I1002 12:57:31.871067 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 12:57:32 crc kubenswrapper[4929]: I1002 12:57:32.040090 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64fd8d7f5-h8q8n"] Oct 02 12:57:32 crc kubenswrapper[4929]: I1002 12:57:32.125719 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 12:57:32 crc kubenswrapper[4929]: I1002 12:57:32.451715 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20702f9c-bac3-4da2-afc6-59ef9849429f","Type":"ContainerStarted","Data":"bd60894b99d6bf492c709baa2f779914ad36ae313c51dcc27e908359357ed87a"} Oct 02 12:57:32 crc kubenswrapper[4929]: I1002 12:57:32.456821 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2215b331-4644-4892-93cc-a99728b27cad","Type":"ContainerStarted","Data":"fc1fb40681d2958356740602eec56bc2e2d1ac06e65734209c6dc3add4c21819"} Oct 02 12:57:32 crc kubenswrapper[4929]: I1002 12:57:32.458409 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" event={"ID":"ad696f41-0dc4-4f9a-b099-c0255ebbe450","Type":"ContainerStarted","Data":"0cb365334b6d7e77672239ac6a11a12c874bb6203b65e762c38fcb65b55877a5"} Oct 02 12:57:32 crc kubenswrapper[4929]: I1002 12:57:32.459499 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"067603a9-c42e-433c-ad6b-1522338b1041","Type":"ContainerStarted","Data":"923ffb8aecddb2dc24ae82f90058d3dfc987ae4bc074d71203d8ad7ef5b85dea"} Oct 02 12:57:33 crc kubenswrapper[4929]: I1002 12:57:33.475367 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20702f9c-bac3-4da2-afc6-59ef9849429f","Type":"ContainerStarted","Data":"effd865db14ac8037cecc61cb17615b43c4df68eba93791a9a8ac4f2604214a0"} Oct 02 12:57:33 crc kubenswrapper[4929]: I1002 12:57:33.477441 4929 generic.go:334] "Generic (PLEG): container finished" podID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerID="0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1" exitCode=0 Oct 02 12:57:33 crc kubenswrapper[4929]: I1002 12:57:33.477497 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" event={"ID":"ad696f41-0dc4-4f9a-b099-c0255ebbe450","Type":"ContainerDied","Data":"0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1"} Oct 02 12:57:34 crc kubenswrapper[4929]: I1002 12:57:34.495247 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"067603a9-c42e-433c-ad6b-1522338b1041","Type":"ContainerStarted","Data":"200010413a160f27db2d60576955eb01035a38fca4ed24dd38ed70d7d962e852"} Oct 02 12:57:34 crc kubenswrapper[4929]: I1002 12:57:34.498999 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20702f9c-bac3-4da2-afc6-59ef9849429f","Type":"ContainerStarted","Data":"2d1b38c97ddee666a0eb26700c3d04cdf1ced36267c39723c3f1236d9f80b200"} Oct 02 12:57:34 crc kubenswrapper[4929]: I1002 12:57:34.499124 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 02 12:57:34 crc kubenswrapper[4929]: I1002 12:57:34.500928 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" event={"ID":"ad696f41-0dc4-4f9a-b099-c0255ebbe450","Type":"ContainerStarted","Data":"98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288"} Oct 02 12:57:34 crc kubenswrapper[4929]: I1002 12:57:34.501256 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:57:34 crc kubenswrapper[4929]: I1002 12:57:34.517399 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.51737569 podStartE2EDuration="3.51737569s" podCreationTimestamp="2025-10-02 12:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:57:34.513493638 +0000 UTC m=+6455.063860002" watchObservedRunningTime="2025-10-02 12:57:34.51737569 +0000 UTC m=+6455.067742054" Oct 02 12:57:34 crc kubenswrapper[4929]: I1002 12:57:34.556150 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" podStartSLOduration=4.556131789 podStartE2EDuration="4.556131789s" podCreationTimestamp="2025-10-02 12:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:57:34.539607552 +0000 UTC m=+6455.089973916" watchObservedRunningTime="2025-10-02 12:57:34.556131789 +0000 UTC m=+6455.106498153" Oct 02 12:57:35 crc kubenswrapper[4929]: I1002 12:57:35.514484 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
Oct 02 12:57:35 crc kubenswrapper[4929]: I1002 12:57:35.557767 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.160621817 podStartE2EDuration="5.557748468s" podCreationTimestamp="2025-10-02 12:57:30 +0000 UTC" firstStartedPulling="2025-10-02 12:57:31.786261161 +0000 UTC m=+6452.336627535" lastFinishedPulling="2025-10-02 12:57:33.183387822 +0000 UTC m=+6453.733754186" observedRunningTime="2025-10-02 12:57:35.550040675 +0000 UTC m=+6456.100407039" watchObservedRunningTime="2025-10-02 12:57:35.557748468 +0000 UTC m=+6456.108114852"
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.190354 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.248112 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n"
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.305072 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845dd45477-hpv7l"]
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.305296 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" podUID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerName="dnsmasq-dns" containerID="cri-o://e45b9445689c4fcf44fd21aee8563add036056f6a912455e33afbb42cbe11d08" gracePeriod=10
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.587047 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2215b331-4644-4892-93cc-a99728b27cad","Type":"ContainerStarted","Data":"759bbd785adcde7a1574adf8201b2a500f51b1b50fa872ba752316c770f729e2"}
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.587382 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"2215b331-4644-4892-93cc-a99728b27cad","Type":"ContainerStarted","Data":"20e11ef49bf107765f2d8c1954a10e2df1c90ce56a97b02c18e06e425da95eb0"}
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.593734 4929 generic.go:334] "Generic (PLEG): container finished" podID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerID="e45b9445689c4fcf44fd21aee8563add036056f6a912455e33afbb42cbe11d08" exitCode=0
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.593784 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" event={"ID":"a583de89-e9f5-48d5-b1d4-b0fb2e53b992","Type":"ContainerDied","Data":"e45b9445689c4fcf44fd21aee8563add036056f6a912455e33afbb42cbe11d08"}
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.609916 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.337178615 podStartE2EDuration="11.609898245s" podCreationTimestamp="2025-10-02 12:57:30 +0000 UTC" firstStartedPulling="2025-10-02 12:57:31.876318592 +0000 UTC m=+6452.426684966" lastFinishedPulling="2025-10-02 12:57:40.149038232 +0000 UTC m=+6460.699404596" observedRunningTime="2025-10-02 12:57:41.603902252 +0000 UTC m=+6462.154268616" watchObservedRunningTime="2025-10-02 12:57:41.609898245 +0000 UTC m=+6462.160264609"
Oct 02 12:57:41 crc kubenswrapper[4929]: I1002 12:57:41.954637 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
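The startup-latency records above expose the tracker's arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is the same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For manila-scheduler-0 the pull window runs from 12:57:31.786 to 12:57:33.183, about 1.397 s, and 5.5577 s - 1.397 s = 4.1606 s, matching the logged SLO value up to rounding; pods whose pull timestamps are the zero time 0001-01-01 (no pull was needed) report identical SLO and E2E durations, as manila-api-0 does. Below is a minimal sketch of how one might extract and cross-check those fields from a saved journal; the program, its regex, and its output format are illustrative only and not part of the kubelet:

// Illustrative sketch: read journal lines on stdin and report, for each
// "Observed pod startup duration" record, the implied image-pull time
// (podStartE2EDuration - podStartSLOduration).
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strconv"
)

// Matches the fields exactly as they appear in the records above.
var re = regexp.MustCompile(
	`pod="([^"]+)" podStartSLOduration=([0-9.]+) podStartE2EDuration="([0-9.]+)s"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // journal lines can exceed the default 64 KiB
	for sc.Scan() {
		m := re.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		slo, _ := strconv.ParseFloat(m[2], 64)
		e2e, _ := strconv.ParseFloat(m[3], 64)
		// e2e-slo is the time spent pulling images; it is ~0 when the
		// pull timestamps are the zero time.
		fmt.Printf("%s: e2e=%.3fs slo=%.3fs pull=%.3fs\n", m[1], e2e, slo, e2e-slo)
	}
}

Fed this section of the journal, it would report a pull time of about 0 s for manila-api-0 and dnsmasq-dns-64fd8d7f5-h8q8n, roughly 1.397 s for manila-scheduler-0, and roughly 8.273 s for manila-share-share1-0.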
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.076290 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-config\") pod \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") "
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.076452 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7pzv\" (UniqueName: \"kubernetes.io/projected/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-kube-api-access-c7pzv\") pod \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") "
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.076518 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-dns-svc\") pod \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") "
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.076548 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-nb\") pod \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") "
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.076614 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-sb\") pod \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\" (UID: \"a583de89-e9f5-48d5-b1d4-b0fb2e53b992\") "
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.082758 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-kube-api-access-c7pzv" (OuterVolumeSpecName: "kube-api-access-c7pzv") pod "a583de89-e9f5-48d5-b1d4-b0fb2e53b992" (UID: "a583de89-e9f5-48d5-b1d4-b0fb2e53b992"). InnerVolumeSpecName "kube-api-access-c7pzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.138465 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a583de89-e9f5-48d5-b1d4-b0fb2e53b992" (UID: "a583de89-e9f5-48d5-b1d4-b0fb2e53b992"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.138490 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a583de89-e9f5-48d5-b1d4-b0fb2e53b992" (UID: "a583de89-e9f5-48d5-b1d4-b0fb2e53b992"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.140561 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a583de89-e9f5-48d5-b1d4-b0fb2e53b992" (UID: "a583de89-e9f5-48d5-b1d4-b0fb2e53b992"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.167510 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-config" (OuterVolumeSpecName: "config") pod "a583de89-e9f5-48d5-b1d4-b0fb2e53b992" (UID: "a583de89-e9f5-48d5-b1d4-b0fb2e53b992"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.179837 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-config\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.179883 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7pzv\" (UniqueName: \"kubernetes.io/projected/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-kube-api-access-c7pzv\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.179899 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.179911 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.179924 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a583de89-e9f5-48d5-b1d4-b0fb2e53b992-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.604307 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845dd45477-hpv7l"
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.604304 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845dd45477-hpv7l" event={"ID":"a583de89-e9f5-48d5-b1d4-b0fb2e53b992","Type":"ContainerDied","Data":"d2d093ca8427026931bf72dafa0ce7b90a79ad664cb8c9989e8b0704a6099354"}
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.604373 4929 scope.go:117] "RemoveContainer" containerID="e45b9445689c4fcf44fd21aee8563add036056f6a912455e33afbb42cbe11d08"
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.626563 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845dd45477-hpv7l"]
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.632908 4929 scope.go:117] "RemoveContainer" containerID="408a8aeac3418e7e6599d90c9c52646e1a0f45146765e2257eab73034d547c02"
Oct 02 12:57:42 crc kubenswrapper[4929]: I1002 12:57:42.634452 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845dd45477-hpv7l"]
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.173002 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" path="/var/lib/kubelet/pods/a583de89-e9f5-48d5-b1d4-b0fb2e53b992/volumes"
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.367787 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.368471 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-central-agent" containerID="cri-o://01d5667a345625e24286c28da9802fb2529d942c7cfc074954b447d263fd01f7" gracePeriod=30
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.368522 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="proxy-httpd" containerID="cri-o://8b68d8848a2aa9508375ecd4adc4e924dc61dccd387be46f5bef0307dae46e8b" gracePeriod=30
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.368785 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-notification-agent" containerID="cri-o://10b439a0a645cb081558cef9cfb8bb121961b27f2f4d130bca081757b9486554" gracePeriod=30
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.368787 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="sg-core" containerID="cri-o://193d592b7ab16a68b9bf905ffb4243f6382e8cb51b86b708f2bb865d0b35c37c" gracePeriod=30
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.630332 4929 generic.go:334] "Generic (PLEG): container finished" podID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerID="193d592b7ab16a68b9bf905ffb4243f6382e8cb51b86b708f2bb865d0b35c37c" exitCode=2
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.630380 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerDied","Data":"193d592b7ab16a68b9bf905ffb4243f6382e8cb51b86b708f2bb865d0b35c37c"}
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.737234 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:57:44 crc kubenswrapper[4929]: I1002 12:57:44.737309 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:57:44 crc kubenswrapper[4929]: E1002 12:57:44.934989 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5eaee92_aa8f_4d95_9999_1e6fb016e5d7.slice/crio-conmon-8b68d8848a2aa9508375ecd4adc4e924dc61dccd387be46f5bef0307dae46e8b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5eaee92_aa8f_4d95_9999_1e6fb016e5d7.slice/crio-conmon-01d5667a345625e24286c28da9802fb2529d942c7cfc074954b447d263fd01f7.scope\": RecentStats: unable to find data in memory cache]"
Oct 02 12:57:45 crc kubenswrapper[4929]: I1002 12:57:45.642104 4929 generic.go:334] "Generic (PLEG): container finished" podID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerID="8b68d8848a2aa9508375ecd4adc4e924dc61dccd387be46f5bef0307dae46e8b" exitCode=0
Oct 02 12:57:45 crc kubenswrapper[4929]: I1002 12:57:45.642424 4929 generic.go:334] "Generic (PLEG): container finished" podID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerID="01d5667a345625e24286c28da9802fb2529d942c7cfc074954b447d263fd01f7" exitCode=0
Oct 02 12:57:45 crc kubenswrapper[4929]: I1002 12:57:45.642246 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerDied","Data":"8b68d8848a2aa9508375ecd4adc4e924dc61dccd387be46f5bef0307dae46e8b"}
Oct 02 12:57:45 crc kubenswrapper[4929]: I1002 12:57:45.642463 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerDied","Data":"01d5667a345625e24286c28da9802fb2529d942c7cfc074954b447d263fd01f7"}
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.681594 4929 generic.go:334] "Generic (PLEG): container finished" podID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerID="10b439a0a645cb081558cef9cfb8bb121961b27f2f4d130bca081757b9486554" exitCode=0
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.681941 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerDied","Data":"10b439a0a645cb081558cef9cfb8bb121961b27f2f4d130bca081757b9486554"}
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.833378 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.936070 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-run-httpd\") pod \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") "
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.936507 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-sg-core-conf-yaml\") pod \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") "
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.936647 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-scripts\") pod \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") "
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.936899 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqc8c\" (UniqueName: \"kubernetes.io/projected/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-kube-api-access-lqc8c\") pod \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") "
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.937212 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-combined-ca-bundle\") pod \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") "
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.937342 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-config-data\") pod \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") "
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.937620 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-log-httpd\") pod \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\" (UID: \"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7\") "
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.938819 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" (UID: "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.939026 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" (UID: "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.940232 4929 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.940261 4929 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.944331 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-kube-api-access-lqc8c" (OuterVolumeSpecName: "kube-api-access-lqc8c") pod "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" (UID: "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7"). InnerVolumeSpecName "kube-api-access-lqc8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.947169 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-scripts" (OuterVolumeSpecName: "scripts") pod "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" (UID: "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:49 crc kubenswrapper[4929]: I1002 12:57:49.979490 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" (UID: "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.033026 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" (UID: "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.043789 4929 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.044206 4929 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.044268 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqc8c\" (UniqueName: \"kubernetes.io/projected/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-kube-api-access-lqc8c\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.044328 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.063443 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-config-data" (OuterVolumeSpecName: "config-data") pod "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" (UID: "a5eaee92-aa8f-4d95-9999-1e6fb016e5d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.146368 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.692717 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerDied","Data":"c0e2e2f229094afc8639093623397f427a87a11adc22f4a68978859096908a21"}
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.692789 4929 scope.go:117] "RemoveContainer" containerID="8b68d8848a2aa9508375ecd4adc4e924dc61dccd387be46f5bef0307dae46e8b"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.692819 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
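At this point the old ceilometer-0 (pod UID a5eaee92-...) is fully torn down, and the records that follow show the API object being deleted, re-added, and remounted under a new pod UID (2009a546-...). One detail useful to anyone post-processing these logs: the event={...} payload in each "SyncLoop (PLEG)" record is plain JSON, so its three fields (pod UID, event type, container or sandbox ID) decode with the standard library. A small illustrative sketch follows; the struct and the regex are ours, not kubelet code:

// Illustrative only: decode the event={...} payload of a PLEG record.
package main

import (
	"encoding/json"
	"fmt"
	"regexp"
)

// plegEvent mirrors the fields visible in the log; the name is ours.
type plegEvent struct {
	ID   string // pod UID
	Type string // ContainerStarted / ContainerDied
	Data string // container or sandbox ID
}

// The payload contains no nested braces, so a simple capture suffices.
var payload = regexp.MustCompile(`event=(\{[^}]*\})`)

func main() {
	line := `Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.692717 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a5eaee92-aa8f-4d95-9999-1e6fb016e5d7","Type":"ContainerDied","Data":"c0e2e2f229094afc8639093623397f427a87a11adc22f4a68978859096908a21"}`
	if m := payload.FindStringSubmatch(line); m != nil {
		var ev plegEvent
		if err := json.Unmarshal([]byte(m[1]), &ev); err == nil {
			fmt.Printf("pod UID %s: %s %s...\n", ev.ID, ev.Type, ev.Data[:12])
		}
	}
}

Run against the ContainerDied record above, it prints the old pod UID with the ID that was torn down; the same decoding applies to every PLEG record in this journal.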
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.714705 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.721773 4929 scope.go:117] "RemoveContainer" containerID="193d592b7ab16a68b9bf905ffb4243f6382e8cb51b86b708f2bb865d0b35c37c"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.723124 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.745002 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:57:50 crc kubenswrapper[4929]: E1002 12:57:50.745759 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="sg-core"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.745786 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="sg-core"
Oct 02 12:57:50 crc kubenswrapper[4929]: E1002 12:57:50.745837 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerName="dnsmasq-dns"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.745847 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerName="dnsmasq-dns"
Oct 02 12:57:50 crc kubenswrapper[4929]: E1002 12:57:50.745864 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-notification-agent"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.745872 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-notification-agent"
Oct 02 12:57:50 crc kubenswrapper[4929]: E1002 12:57:50.745893 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerName="init"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.745902 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerName="init"
Oct 02 12:57:50 crc kubenswrapper[4929]: E1002 12:57:50.745921 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-central-agent"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.745930 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-central-agent"
Oct 02 12:57:50 crc kubenswrapper[4929]: E1002 12:57:50.745946 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="proxy-httpd"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.745969 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="proxy-httpd"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.746219 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="proxy-httpd"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.746245 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="sg-core"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.746276 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-central-agent"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.746340 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a583de89-e9f5-48d5-b1d4-b0fb2e53b992" containerName="dnsmasq-dns"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.746353 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" containerName="ceilometer-notification-agent"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.749052 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.756003 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.759723 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.764705 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.766059 4929 scope.go:117] "RemoveContainer" containerID="10b439a0a645cb081558cef9cfb8bb121961b27f2f4d130bca081757b9486554"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.792791 4929 scope.go:117] "RemoveContainer" containerID="01d5667a345625e24286c28da9802fb2529d942c7cfc074954b447d263fd01f7"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.861304 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2009a546-483b-41a2-97ef-2ceb4c8356e6-run-httpd\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.861563 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.861822 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2009a546-483b-41a2-97ef-2ceb4c8356e6-log-httpd\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.861936 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.862087 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-scripts\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.862217 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjn8c\" (UniqueName: \"kubernetes.io/projected/2009a546-483b-41a2-97ef-2ceb4c8356e6-kube-api-access-kjn8c\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.862348 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-config-data\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.963940 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2009a546-483b-41a2-97ef-2ceb4c8356e6-log-httpd\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.964245 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.964393 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-scripts\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.964530 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjn8c\" (UniqueName: \"kubernetes.io/projected/2009a546-483b-41a2-97ef-2ceb4c8356e6-kube-api-access-kjn8c\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.964596 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2009a546-483b-41a2-97ef-2ceb4c8356e6-log-httpd\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.964708 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-config-data\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.964988 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2009a546-483b-41a2-97ef-2ceb4c8356e6-run-httpd\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.965150 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.965758 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2009a546-483b-41a2-97ef-2ceb4c8356e6-run-httpd\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.971103 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.971207 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-config-data\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.971869 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-scripts\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.972451 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2009a546-483b-41a2-97ef-2ceb4c8356e6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:50 crc kubenswrapper[4929]: I1002 12:57:50.982371 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjn8c\" (UniqueName: \"kubernetes.io/projected/2009a546-483b-41a2-97ef-2ceb4c8356e6-kube-api-access-kjn8c\") pod \"ceilometer-0\" (UID: \"2009a546-483b-41a2-97ef-2ceb4c8356e6\") " pod="openstack/ceilometer-0"
Oct 02 12:57:51 crc kubenswrapper[4929]: I1002 12:57:51.073742 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 12:57:51 crc kubenswrapper[4929]: I1002 12:57:51.123145 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Oct 02 12:57:51 crc kubenswrapper[4929]: I1002 12:57:51.585171 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 12:57:51 crc kubenswrapper[4929]: I1002 12:57:51.710192 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2009a546-483b-41a2-97ef-2ceb4c8356e6","Type":"ContainerStarted","Data":"e7249d6a29824cfa070eff919afc7e96d85f062cc47f47e44684fb7b59f13c17"}
Oct 02 12:57:52 crc kubenswrapper[4929]: I1002 12:57:52.171015 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5eaee92-aa8f-4d95-9999-1e6fb016e5d7" path="/var/lib/kubelet/pods/a5eaee92-aa8f-4d95-9999-1e6fb016e5d7/volumes"
Oct 02 12:57:52 crc kubenswrapper[4929]: I1002 12:57:52.708314 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Oct 02 12:57:52 crc kubenswrapper[4929]: I1002 12:57:52.720854 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2009a546-483b-41a2-97ef-2ceb4c8356e6","Type":"ContainerStarted","Data":"68429fc0259c232b73600f5cc06a32a5276a56e90f682513b3e3277f58420df4"}
Oct 02 12:57:52 crc kubenswrapper[4929]: I1002 12:57:52.967571 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Oct 02 12:57:52 crc kubenswrapper[4929]: I1002 12:57:52.997085 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Oct 02 12:57:54 crc kubenswrapper[4929]: I1002 12:57:54.755659 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2009a546-483b-41a2-97ef-2ceb4c8356e6","Type":"ContainerStarted","Data":"f41706a02e6001c98eac1dea7d054f929eb67c921ce4acc5c5eacb75aea941c8"}
Oct 02 12:57:55 crc kubenswrapper[4929]: I1002 12:57:55.769319 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2009a546-483b-41a2-97ef-2ceb4c8356e6","Type":"ContainerStarted","Data":"b2be0f9f9cfd289f85030b4829059bd53cd21088cd16a98bd789eb8180b0ab90"}
Oct 02 12:57:57 crc kubenswrapper[4929]: I1002 12:57:57.790366 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2009a546-483b-41a2-97ef-2ceb4c8356e6","Type":"ContainerStarted","Data":"0dd54061368ec33017f0561c7b6ea07665a50ad3308cbd29296b74fa4cde3ffe"}
Oct 02 12:57:57 crc kubenswrapper[4929]: I1002 12:57:57.790877 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 02 12:57:57 crc kubenswrapper[4929]: I1002 12:57:57.814828 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.420593668 podStartE2EDuration="7.814805893s" podCreationTimestamp="2025-10-02 12:57:50 +0000 UTC" firstStartedPulling="2025-10-02 12:57:51.589897206 +0000 UTC m=+6472.140263570" lastFinishedPulling="2025-10-02 12:57:56.984109431 +0000 UTC m=+6477.534475795" observedRunningTime="2025-10-02 12:57:57.808563272 +0000 UTC m=+6478.358929636" watchObservedRunningTime="2025-10-02 12:57:57.814805893 +0000 UTC m=+6478.365172257"
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.736894 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.737699 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.737754 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.738796 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71dbe36c7a6d8e09fe4ce647ebe551abca28a81abef88dabeb8a84825d9cf7fa"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.738853 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://71dbe36c7a6d8e09fe4ce647ebe551abca28a81abef88dabeb8a84825d9cf7fa" gracePeriod=600
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.979455 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="71dbe36c7a6d8e09fe4ce647ebe551abca28a81abef88dabeb8a84825d9cf7fa" exitCode=0
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.979681 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"71dbe36c7a6d8e09fe4ce647ebe551abca28a81abef88dabeb8a84825d9cf7fa"}
Oct 02 12:58:14 crc kubenswrapper[4929]: I1002 12:58:14.979799 4929 scope.go:117] "RemoveContainer" containerID="c91f9c8668a5ca2033c6910378d36184616f1624d79873a4f87a7ee5f6597df0"
Oct 02 12:58:15 crc kubenswrapper[4929]: I1002 12:58:15.990851 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"}
Oct 02 12:58:21 crc kubenswrapper[4929]: I1002 12:58:21.084181 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 12:58:38 crc kubenswrapper[4929]: I1002 12:58:38.952101 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68747465f7-qbg22"]
Oct 02 12:58:38 crc kubenswrapper[4929]: I1002 12:58:38.954689 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68747465f7-qbg22"
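The machine-config-daemon records above show a complete liveness-restart cycle in one place: the probe against http://127.0.0.1:8798/health is refused, the kubelet marks the container "failed liveness probe, will be restarted", kills it with a 600 s grace period, and a new container ID (f73155b4...) appears about a second later. The same probe failure was already logged at 12:57:44, so this pod is a candidate for a restart loop. A small illustrative helper for spotting such loops in a saved journal; this is our own sketch, not part of any shipped tooling:

// Illustrative only: count liveness-probe failures per pod from journal
// lines on stdin, to spot restart loops like the one above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var failure = regexp.MustCompile(`"Probe failed" probeType="Liveness" pod="([^"]+)"`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // journal lines can exceed the default 64 KiB
	for sc.Scan() {
		if m := failure.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for pod, n := range counts {
		fmt.Printf("%s: %d liveness failure(s)\n", pod, n)
	}
}

On this section of the journal it would report two liveness failures for openshift-machine-config-operator/machine-config-daemon-8j488 and none for the OpenStack pods.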
Oct 02 12:58:38 crc kubenswrapper[4929]: I1002 12:58:38.956623 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1"
Oct 02 12:58:38 crc kubenswrapper[4929]: I1002 12:58:38.975936 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68747465f7-qbg22"]
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.060718 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8mpr\" (UniqueName: \"kubernetes.io/projected/c6c521c8-dd31-421f-ab4e-059703a63bda-kube-api-access-w8mpr\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.061565 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-dns-svc\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.061626 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-sb\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.061779 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-nb\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.061811 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-config\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.061868 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-openstack-cell1\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.164350 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-dns-svc\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.164417 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-sb\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.164547 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-nb\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.164573 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-config\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.164609 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-openstack-cell1\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.164667 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8mpr\" (UniqueName: \"kubernetes.io/projected/c6c521c8-dd31-421f-ab4e-059703a63bda-kube-api-access-w8mpr\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.165427 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-dns-svc\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.165487 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-openstack-cell1\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.165636 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-nb\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.165657 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-config\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.166036 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-sb\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.189387 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8mpr\" (UniqueName: \"kubernetes.io/projected/c6c521c8-dd31-421f-ab4e-059703a63bda-kube-api-access-w8mpr\") pod \"dnsmasq-dns-68747465f7-qbg22\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.278041 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:39 crc kubenswrapper[4929]: I1002 12:58:39.871550 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68747465f7-qbg22"]
Oct 02 12:58:40 crc kubenswrapper[4929]: I1002 12:58:40.217546 4929 generic.go:334] "Generic (PLEG): container finished" podID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerID="d63a3e91bf7554c418d3e8100f29186d4a14bd6971482b350b4be0ceeae99915" exitCode=0
Oct 02 12:58:40 crc kubenswrapper[4929]: I1002 12:58:40.217867 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68747465f7-qbg22" event={"ID":"c6c521c8-dd31-421f-ab4e-059703a63bda","Type":"ContainerDied","Data":"d63a3e91bf7554c418d3e8100f29186d4a14bd6971482b350b4be0ceeae99915"}
Oct 02 12:58:40 crc kubenswrapper[4929]: I1002 12:58:40.217895 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68747465f7-qbg22" event={"ID":"c6c521c8-dd31-421f-ab4e-059703a63bda","Type":"ContainerStarted","Data":"f1e5e8f7a2ad0bd188560c596254c41fd1b2a81c87b37100a3a0689e616922e4"}
Oct 02 12:58:41 crc kubenswrapper[4929]: I1002 12:58:41.228715 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68747465f7-qbg22" event={"ID":"c6c521c8-dd31-421f-ab4e-059703a63bda","Type":"ContainerStarted","Data":"53ac1e449cea77d54708f0eb9eed0a8e68188e5152a7018108a627b58c4c5b2d"}
Oct 02 12:58:41 crc kubenswrapper[4929]: I1002 12:58:41.229079 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:41 crc kubenswrapper[4929]: I1002 12:58:41.252231 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68747465f7-qbg22" podStartSLOduration=3.2522111049999998 podStartE2EDuration="3.252211105s" podCreationTimestamp="2025-10-02 12:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:58:41.243887275 +0000 UTC m=+6521.794253639" watchObservedRunningTime="2025-10-02 12:58:41.252211105 +0000 UTC m=+6521.802577469"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.279922 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68747465f7-qbg22"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.334938 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fd8d7f5-h8q8n"]
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.335264 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" podUID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerName="dnsmasq-dns" containerID="cri-o://98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288" gracePeriod=10
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.476249 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79c74f8487-xpcth"]
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.479790 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.513051 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c74f8487-xpcth"]
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.606738 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-config\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.606786 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-openstack-cell1\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.606824 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-dns-svc\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.606986 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-ovsdbserver-sb\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.607129 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srdg\" (UniqueName: \"kubernetes.io/projected/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-kube-api-access-4srdg\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.607469 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-ovsdbserver-nb\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.709854 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-config\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.709924 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-openstack-cell1\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.709990 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-dns-svc\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth"
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-dns-svc\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.711353 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-dns-svc\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.711560 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-openstack-cell1\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.710033 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-ovsdbserver-sb\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.711676 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4srdg\" (UniqueName: \"kubernetes.io/projected/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-kube-api-access-4srdg\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.711878 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-ovsdbserver-sb\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.711896 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-ovsdbserver-nb\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.712666 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-ovsdbserver-nb\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.717434 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-config\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.732895 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4srdg\" (UniqueName: 
\"kubernetes.io/projected/b0f16e97-68b0-4528-a176-ba1c3a8e95cb-kube-api-access-4srdg\") pod \"dnsmasq-dns-79c74f8487-xpcth\" (UID: \"b0f16e97-68b0-4528-a176-ba1c3a8e95cb\") " pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:49 crc kubenswrapper[4929]: I1002 12:58:49.844385 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.008394 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.136023 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-config\") pod \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.136352 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-nb\") pod \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.136573 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-sb\") pod \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.136616 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-dns-svc\") pod \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.136663 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzfhk\" (UniqueName: \"kubernetes.io/projected/ad696f41-0dc4-4f9a-b099-c0255ebbe450-kube-api-access-nzfhk\") pod \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\" (UID: \"ad696f41-0dc4-4f9a-b099-c0255ebbe450\") " Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.144183 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad696f41-0dc4-4f9a-b099-c0255ebbe450-kube-api-access-nzfhk" (OuterVolumeSpecName: "kube-api-access-nzfhk") pod "ad696f41-0dc4-4f9a-b099-c0255ebbe450" (UID: "ad696f41-0dc4-4f9a-b099-c0255ebbe450"). InnerVolumeSpecName "kube-api-access-nzfhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.195173 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad696f41-0dc4-4f9a-b099-c0255ebbe450" (UID: "ad696f41-0dc4-4f9a-b099-c0255ebbe450"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.205282 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-config" (OuterVolumeSpecName: "config") pod "ad696f41-0dc4-4f9a-b099-c0255ebbe450" (UID: "ad696f41-0dc4-4f9a-b099-c0255ebbe450"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.208290 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad696f41-0dc4-4f9a-b099-c0255ebbe450" (UID: "ad696f41-0dc4-4f9a-b099-c0255ebbe450"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.228325 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad696f41-0dc4-4f9a-b099-c0255ebbe450" (UID: "ad696f41-0dc4-4f9a-b099-c0255ebbe450"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.240534 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.240565 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.240576 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzfhk\" (UniqueName: \"kubernetes.io/projected/ad696f41-0dc4-4f9a-b099-c0255ebbe450-kube-api-access-nzfhk\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.240588 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.240600 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad696f41-0dc4-4f9a-b099-c0255ebbe450-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.331625 4929 generic.go:334] "Generic (PLEG): container finished" podID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerID="98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288" exitCode=0 Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.331679 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" event={"ID":"ad696f41-0dc4-4f9a-b099-c0255ebbe450","Type":"ContainerDied","Data":"98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288"} Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.331713 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" event={"ID":"ad696f41-0dc4-4f9a-b099-c0255ebbe450","Type":"ContainerDied","Data":"0cb365334b6d7e77672239ac6a11a12c874bb6203b65e762c38fcb65b55877a5"} Oct 02 12:58:50 
crc kubenswrapper[4929]: I1002 12:58:50.331735 4929 scope.go:117] "RemoveContainer" containerID="98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.331855 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fd8d7f5-h8q8n" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.357015 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c74f8487-xpcth"] Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.371348 4929 scope.go:117] "RemoveContainer" containerID="0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.375789 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fd8d7f5-h8q8n"] Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.387354 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64fd8d7f5-h8q8n"] Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.396188 4929 scope.go:117] "RemoveContainer" containerID="98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288" Oct 02 12:58:50 crc kubenswrapper[4929]: E1002 12:58:50.396622 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288\": container with ID starting with 98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288 not found: ID does not exist" containerID="98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.396654 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288"} err="failed to get container status \"98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288\": rpc error: code = NotFound desc = could not find container \"98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288\": container with ID starting with 98eb6142bee83e2ebd789517a5f2c50c90bb4f50cfd15b84d42a809435334288 not found: ID does not exist" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.396677 4929 scope.go:117] "RemoveContainer" containerID="0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1" Oct 02 12:58:50 crc kubenswrapper[4929]: E1002 12:58:50.397235 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1\": container with ID starting with 0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1 not found: ID does not exist" containerID="0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1" Oct 02 12:58:50 crc kubenswrapper[4929]: I1002 12:58:50.397298 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1"} err="failed to get container status \"0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1\": rpc error: code = NotFound desc = could not find container \"0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1\": container with ID starting with 0c14652720de61c2bafb1e51c81ffe99297df1e4d40d730b80d32fc93a258db1 not found: ID does not exist" Oct 02 12:58:51 crc kubenswrapper[4929]: I1002 
12:58:51.343616 4929 generic.go:334] "Generic (PLEG): container finished" podID="b0f16e97-68b0-4528-a176-ba1c3a8e95cb" containerID="989b30d32bedde045abf46855a91ba69ac8246fb78f04458c45edbf233484750" exitCode=0 Oct 02 12:58:51 crc kubenswrapper[4929]: I1002 12:58:51.343680 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c74f8487-xpcth" event={"ID":"b0f16e97-68b0-4528-a176-ba1c3a8e95cb","Type":"ContainerDied","Data":"989b30d32bedde045abf46855a91ba69ac8246fb78f04458c45edbf233484750"} Oct 02 12:58:51 crc kubenswrapper[4929]: I1002 12:58:51.344039 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c74f8487-xpcth" event={"ID":"b0f16e97-68b0-4528-a176-ba1c3a8e95cb","Type":"ContainerStarted","Data":"42e22184fbda0c61346b1be6cf630f72579adcc9f106ec964f3995f9bff545e7"} Oct 02 12:58:52 crc kubenswrapper[4929]: I1002 12:58:52.170723 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" path="/var/lib/kubelet/pods/ad696f41-0dc4-4f9a-b099-c0255ebbe450/volumes" Oct 02 12:58:52 crc kubenswrapper[4929]: I1002 12:58:52.359795 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c74f8487-xpcth" event={"ID":"b0f16e97-68b0-4528-a176-ba1c3a8e95cb","Type":"ContainerStarted","Data":"fcfb8cf8303127c01089e8009d0bdf506b1c653200564ae3c16ff42120a82d47"} Oct 02 12:58:52 crc kubenswrapper[4929]: I1002 12:58:52.359990 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:59 crc kubenswrapper[4929]: I1002 12:58:59.847599 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79c74f8487-xpcth" Oct 02 12:58:59 crc kubenswrapper[4929]: I1002 12:58:59.865224 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79c74f8487-xpcth" podStartSLOduration=10.865207142 podStartE2EDuration="10.865207142s" podCreationTimestamp="2025-10-02 12:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:58:52.386985287 +0000 UTC m=+6532.937351661" watchObservedRunningTime="2025-10-02 12:58:59.865207142 +0000 UTC m=+6540.415573506" Oct 02 12:58:59 crc kubenswrapper[4929]: I1002 12:58:59.921138 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68747465f7-qbg22"] Oct 02 12:58:59 crc kubenswrapper[4929]: I1002 12:58:59.921401 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68747465f7-qbg22" podUID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerName="dnsmasq-dns" containerID="cri-o://53ac1e449cea77d54708f0eb9eed0a8e68188e5152a7018108a627b58c4c5b2d" gracePeriod=10 Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.436149 4929 generic.go:334] "Generic (PLEG): container finished" podID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerID="53ac1e449cea77d54708f0eb9eed0a8e68188e5152a7018108a627b58c4c5b2d" exitCode=0 Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.437105 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68747465f7-qbg22" event={"ID":"c6c521c8-dd31-421f-ab4e-059703a63bda","Type":"ContainerDied","Data":"53ac1e449cea77d54708f0eb9eed0a8e68188e5152a7018108a627b58c4c5b2d"} Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.437157 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-68747465f7-qbg22" event={"ID":"c6c521c8-dd31-421f-ab4e-059703a63bda","Type":"ContainerDied","Data":"f1e5e8f7a2ad0bd188560c596254c41fd1b2a81c87b37100a3a0689e616922e4"} Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.437172 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1e5e8f7a2ad0bd188560c596254c41fd1b2a81c87b37100a3a0689e616922e4" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.486213 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68747465f7-qbg22" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.581264 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-sb\") pod \"c6c521c8-dd31-421f-ab4e-059703a63bda\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.581331 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-dns-svc\") pod \"c6c521c8-dd31-421f-ab4e-059703a63bda\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.581353 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-openstack-cell1\") pod \"c6c521c8-dd31-421f-ab4e-059703a63bda\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.581382 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8mpr\" (UniqueName: \"kubernetes.io/projected/c6c521c8-dd31-421f-ab4e-059703a63bda-kube-api-access-w8mpr\") pod \"c6c521c8-dd31-421f-ab4e-059703a63bda\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.581539 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-nb\") pod \"c6c521c8-dd31-421f-ab4e-059703a63bda\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.581666 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-config\") pod \"c6c521c8-dd31-421f-ab4e-059703a63bda\" (UID: \"c6c521c8-dd31-421f-ab4e-059703a63bda\") " Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.607021 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c521c8-dd31-421f-ab4e-059703a63bda-kube-api-access-w8mpr" (OuterVolumeSpecName: "kube-api-access-w8mpr") pod "c6c521c8-dd31-421f-ab4e-059703a63bda" (UID: "c6c521c8-dd31-421f-ab4e-059703a63bda"). InnerVolumeSpecName "kube-api-access-w8mpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.642595 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "c6c521c8-dd31-421f-ab4e-059703a63bda" (UID: "c6c521c8-dd31-421f-ab4e-059703a63bda"). 
InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.653851 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6c521c8-dd31-421f-ab4e-059703a63bda" (UID: "c6c521c8-dd31-421f-ab4e-059703a63bda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.654518 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c6c521c8-dd31-421f-ab4e-059703a63bda" (UID: "c6c521c8-dd31-421f-ab4e-059703a63bda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.654670 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-config" (OuterVolumeSpecName: "config") pod "c6c521c8-dd31-421f-ab4e-059703a63bda" (UID: "c6c521c8-dd31-421f-ab4e-059703a63bda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.659192 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c6c521c8-dd31-421f-ab4e-059703a63bda" (UID: "c6c521c8-dd31-421f-ab4e-059703a63bda"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.683667 4929 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.683703 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.683714 4929 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.683726 4929 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.683736 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8mpr\" (UniqueName: \"kubernetes.io/projected/c6c521c8-dd31-421f-ab4e-059703a63bda-kube-api-access-w8mpr\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:00 crc kubenswrapper[4929]: I1002 12:59:00.683745 4929 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6c521c8-dd31-421f-ab4e-059703a63bda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:01 crc kubenswrapper[4929]: I1002 12:59:01.444803 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68747465f7-qbg22" Oct 02 12:59:01 crc kubenswrapper[4929]: I1002 12:59:01.477269 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68747465f7-qbg22"] Oct 02 12:59:01 crc kubenswrapper[4929]: I1002 12:59:01.486259 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68747465f7-qbg22"] Oct 02 12:59:02 crc kubenswrapper[4929]: I1002 12:59:02.040374 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-4kqts"] Oct 02 12:59:02 crc kubenswrapper[4929]: I1002 12:59:02.051250 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-4kqts"] Oct 02 12:59:02 crc kubenswrapper[4929]: I1002 12:59:02.173173 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9c92bb-12e4-4a69-ba64-217902786770" path="/var/lib/kubelet/pods/8a9c92bb-12e4-4a69-ba64-217902786770/volumes" Oct 02 12:59:02 crc kubenswrapper[4929]: I1002 12:59:02.174268 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c521c8-dd31-421f-ab4e-059703a63bda" path="/var/lib/kubelet/pods/c6c521c8-dd31-421f-ab4e-059703a63bda/volumes" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.766599 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj"] Oct 02 12:59:10 crc kubenswrapper[4929]: E1002 12:59:10.767862 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerName="init" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.767879 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerName="init" Oct 02 12:59:10 crc kubenswrapper[4929]: E1002 12:59:10.767904 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerName="init" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.767912 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerName="init" Oct 02 12:59:10 crc kubenswrapper[4929]: E1002 12:59:10.767929 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerName="dnsmasq-dns" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.767937 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerName="dnsmasq-dns" Oct 02 12:59:10 crc kubenswrapper[4929]: E1002 12:59:10.767973 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerName="dnsmasq-dns" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.767983 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerName="dnsmasq-dns" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.768213 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad696f41-0dc4-4f9a-b099-c0255ebbe450" containerName="dnsmasq-dns" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.768241 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c521c8-dd31-421f-ab4e-059703a63bda" containerName="dnsmasq-dns" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.769296 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.771977 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.772184 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.772202 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.772644 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.784518 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj"] Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.893005 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.893108 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db5dk\" (UniqueName: \"kubernetes.io/projected/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-kube-api-access-db5dk\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.893160 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.893323 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.893401 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.995537 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.996408 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.996564 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.996669 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db5dk\" (UniqueName: \"kubernetes.io/projected/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-kube-api-access-db5dk\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:10 crc kubenswrapper[4929]: I1002 12:59:10.996760 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:11 crc kubenswrapper[4929]: I1002 12:59:11.001760 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:11 crc kubenswrapper[4929]: I1002 12:59:11.006803 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:11 crc kubenswrapper[4929]: I1002 12:59:11.010558 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 
12:59:11 crc kubenswrapper[4929]: I1002 12:59:11.010736 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:11 crc kubenswrapper[4929]: I1002 12:59:11.014439 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db5dk\" (UniqueName: \"kubernetes.io/projected/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-kube-api-access-db5dk\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:11 crc kubenswrapper[4929]: I1002 12:59:11.114638 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:11 crc kubenswrapper[4929]: I1002 12:59:11.722164 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj"] Oct 02 12:59:12 crc kubenswrapper[4929]: I1002 12:59:12.545504 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" event={"ID":"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027","Type":"ContainerStarted","Data":"2e6e9e979a5317e4f86a740441bc5530c84dfcb2a266f9afead574a7c8e5afa5"} Oct 02 12:59:14 crc kubenswrapper[4929]: I1002 12:59:14.031429 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-67ef-account-create-5l7xv"] Oct 02 12:59:14 crc kubenswrapper[4929]: I1002 12:59:14.041103 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-67ef-account-create-5l7xv"] Oct 02 12:59:14 crc kubenswrapper[4929]: I1002 12:59:14.181091 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3356ee2b-8c66-4380-ae15-60353943d39f" path="/var/lib/kubelet/pods/3356ee2b-8c66-4380-ae15-60353943d39f/volumes" Oct 02 12:59:20 crc kubenswrapper[4929]: I1002 12:59:20.037126 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-jxk5j"] Oct 02 12:59:20 crc kubenswrapper[4929]: I1002 12:59:20.058399 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-jxk5j"] Oct 02 12:59:20 crc kubenswrapper[4929]: I1002 12:59:20.189993 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7e6ee8-1706-4e44-a7e4-a932a2bbd847" path="/var/lib/kubelet/pods/1e7e6ee8-1706-4e44-a7e4-a932a2bbd847/volumes" Oct 02 12:59:24 crc kubenswrapper[4929]: I1002 12:59:24.685977 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" event={"ID":"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027","Type":"ContainerStarted","Data":"b347a62cf6ddb8c47b9940b2e24968eb65ccb98151ca316fe4b83a402f064200"} Oct 02 12:59:24 crc kubenswrapper[4929]: I1002 12:59:24.710568 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" podStartSLOduration=2.755723484 podStartE2EDuration="14.71054964s" podCreationTimestamp="2025-10-02 12:59:10 +0000 UTC" firstStartedPulling="2025-10-02 
12:59:11.73036753 +0000 UTC m=+6552.280733894" lastFinishedPulling="2025-10-02 12:59:23.685193686 +0000 UTC m=+6564.235560050" observedRunningTime="2025-10-02 12:59:24.705238937 +0000 UTC m=+6565.255605301" watchObservedRunningTime="2025-10-02 12:59:24.71054964 +0000 UTC m=+6565.260916004" Oct 02 12:59:26 crc kubenswrapper[4929]: I1002 12:59:26.636717 4929 scope.go:117] "RemoveContainer" containerID="7c63ca0c526a29a6c0d15658320ef045d7311a910bf92a13dc5f952159bc0852" Oct 02 12:59:26 crc kubenswrapper[4929]: I1002 12:59:26.818205 4929 scope.go:117] "RemoveContainer" containerID="39eb23fb45a461061f126955d6db4b7ae637b25c3cdea476668d967ab4911258" Oct 02 12:59:26 crc kubenswrapper[4929]: I1002 12:59:26.843145 4929 scope.go:117] "RemoveContainer" containerID="6cc546dbf828b7531d876935e21f75bda40766a034ed98564be7367940f44004" Oct 02 12:59:26 crc kubenswrapper[4929]: I1002 12:59:26.897613 4929 scope.go:117] "RemoveContainer" containerID="695d8e4de01a2142e05074a7c059e6c674c40b111e7065b037edd1f2f625bcd4" Oct 02 12:59:26 crc kubenswrapper[4929]: I1002 12:59:26.924007 4929 scope.go:117] "RemoveContainer" containerID="618fbc37d8d67855a8c79f70c2824a3d62c20ae082fcfe5b4090dee16ff8c893" Oct 02 12:59:31 crc kubenswrapper[4929]: I1002 12:59:31.033488 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-2514-account-create-6sxwx"] Oct 02 12:59:31 crc kubenswrapper[4929]: I1002 12:59:31.043466 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-2514-account-create-6sxwx"] Oct 02 12:59:32 crc kubenswrapper[4929]: I1002 12:59:32.171649 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e3189c-a438-45f6-bce7-8fa19475d892" path="/var/lib/kubelet/pods/c7e3189c-a438-45f6-bce7-8fa19475d892/volumes" Oct 02 12:59:37 crc kubenswrapper[4929]: I1002 12:59:37.808112 4929 generic.go:334] "Generic (PLEG): container finished" podID="f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" containerID="b347a62cf6ddb8c47b9940b2e24968eb65ccb98151ca316fe4b83a402f064200" exitCode=0 Oct 02 12:59:37 crc kubenswrapper[4929]: I1002 12:59:37.808206 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" event={"ID":"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027","Type":"ContainerDied","Data":"b347a62cf6ddb8c47b9940b2e24968eb65ccb98151ca316fe4b83a402f064200"} Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.293667 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.338606 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key\") pod \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.338689 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-inventory\") pod \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.338782 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db5dk\" (UniqueName: \"kubernetes.io/projected/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-kube-api-access-db5dk\") pod \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.338865 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ceph\") pod \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.339197 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-pre-adoption-validation-combined-ca-bundle\") pod \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.348475 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ceph" (OuterVolumeSpecName: "ceph") pod "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" (UID: "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.348539 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" (UID: "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.348697 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-kube-api-access-db5dk" (OuterVolumeSpecName: "kube-api-access-db5dk") pod "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" (UID: "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027"). InnerVolumeSpecName "kube-api-access-db5dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:39 crc kubenswrapper[4929]: E1002 12:59:39.376304 4929 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key podName:f83da3fa-a3f5-4a4e-a3b0-a8908d72a027 nodeName:}" failed. 
No retries permitted until 2025-10-02 12:59:39.876272336 +0000 UTC m=+6580.426638700 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key") pod "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" (UID: "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027") : error deleting /var/lib/kubelet/pods/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027/volume-subpaths: remove /var/lib/kubelet/pods/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027/volume-subpaths: no such file or directory Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.379692 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-inventory" (OuterVolumeSpecName: "inventory") pod "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" (UID: "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.441917 4929 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.441986 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.441998 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db5dk\" (UniqueName: \"kubernetes.io/projected/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-kube-api-access-db5dk\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.442009 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.830694 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" event={"ID":"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027","Type":"ContainerDied","Data":"2e6e9e979a5317e4f86a740441bc5530c84dfcb2a266f9afead574a7c8e5afa5"} Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.831230 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e6e9e979a5317e4f86a740441bc5530c84dfcb2a266f9afead574a7c8e5afa5" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.831340 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj" Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.954911 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key\") pod \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\" (UID: \"f83da3fa-a3f5-4a4e-a3b0-a8908d72a027\") " Oct 02 12:59:39 crc kubenswrapper[4929]: I1002 12:59:39.972061 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" (UID: "f83da3fa-a3f5-4a4e-a3b0-a8908d72a027"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:40 crc kubenswrapper[4929]: I1002 12:59:40.057102 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f83da3fa-a3f5-4a4e-a3b0-a8908d72a027-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.688777 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw"] Oct 02 12:59:43 crc kubenswrapper[4929]: E1002 12:59:43.689837 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.689854 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.690178 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83da3fa-a3f5-4a4e-a3b0-a8908d72a027" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.691093 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.693562 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.693788 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.693996 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.695423 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.700708 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw"] Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.727359 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.727415 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k644\" (UniqueName: \"kubernetes.io/projected/4bff17f1-9395-47c4-a706-a0adf7745c50-kube-api-access-8k644\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.727491 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.727512 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.727622 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.829400 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.829483 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.829508 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k644\" (UniqueName: \"kubernetes.io/projected/4bff17f1-9395-47c4-a706-a0adf7745c50-kube-api-access-8k644\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.829575 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.830060 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.837927 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.843855 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.843907 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.845043 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:43 crc kubenswrapper[4929]: I1002 12:59:43.852477 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k644\" (UniqueName: \"kubernetes.io/projected/4bff17f1-9395-47c4-a706-a0adf7745c50-kube-api-access-8k644\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:44 crc kubenswrapper[4929]: I1002 12:59:44.024529 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 12:59:44 crc kubenswrapper[4929]: I1002 12:59:44.575242 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw"] Oct 02 12:59:44 crc kubenswrapper[4929]: I1002 12:59:44.880903 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" event={"ID":"4bff17f1-9395-47c4-a706-a0adf7745c50","Type":"ContainerStarted","Data":"806664fab10e9d4d8e471a1b215c061de5d1c7f6c206f5f7ca6f62a8a08bab3f"} Oct 02 12:59:45 crc kubenswrapper[4929]: I1002 12:59:45.894646 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" event={"ID":"4bff17f1-9395-47c4-a706-a0adf7745c50","Type":"ContainerStarted","Data":"bb0bbbbcfd0261fb8a50b2809de5daf571ac3064e72a02e58c7fe7f69dd72ff7"} Oct 02 12:59:45 crc kubenswrapper[4929]: I1002 12:59:45.913447 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" podStartSLOduration=2.389130198 podStartE2EDuration="2.91342964s" podCreationTimestamp="2025-10-02 12:59:43 +0000 UTC" firstStartedPulling="2025-10-02 12:59:44.581678027 +0000 UTC m=+6585.132044391" lastFinishedPulling="2025-10-02 12:59:45.105977469 +0000 UTC m=+6585.656343833" observedRunningTime="2025-10-02 12:59:45.908078686 +0000 UTC m=+6586.458445050" watchObservedRunningTime="2025-10-02 12:59:45.91342964 +0000 UTC m=+6586.463796014" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.206214 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t"] Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.208252 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.211068 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t"] Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.212300 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.212344 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.397387 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a98dc16-e89b-4248-96b9-82977c83f774-config-volume\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.397552 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a98dc16-e89b-4248-96b9-82977c83f774-secret-volume\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.397644 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc6bh\" (UniqueName: \"kubernetes.io/projected/3a98dc16-e89b-4248-96b9-82977c83f774-kube-api-access-fc6bh\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.500082 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc6bh\" (UniqueName: \"kubernetes.io/projected/3a98dc16-e89b-4248-96b9-82977c83f774-kube-api-access-fc6bh\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.500507 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a98dc16-e89b-4248-96b9-82977c83f774-config-volume\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.500655 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a98dc16-e89b-4248-96b9-82977c83f774-secret-volume\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.501871 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a98dc16-e89b-4248-96b9-82977c83f774-config-volume\") pod 
\"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.507008 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a98dc16-e89b-4248-96b9-82977c83f774-secret-volume\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.518728 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc6bh\" (UniqueName: \"kubernetes.io/projected/3a98dc16-e89b-4248-96b9-82977c83f774-kube-api-access-fc6bh\") pod \"collect-profiles-29323500-2944t\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:00 crc kubenswrapper[4929]: I1002 13:00:00.543292 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:01 crc kubenswrapper[4929]: I1002 13:00:01.007412 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t"] Oct 02 13:00:01 crc kubenswrapper[4929]: I1002 13:00:01.046197 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" event={"ID":"3a98dc16-e89b-4248-96b9-82977c83f774","Type":"ContainerStarted","Data":"1101397ad8e1ff160f203bbc64a40924efd0d4b503f1f31f48c93a0966787009"} Oct 02 13:00:02 crc kubenswrapper[4929]: I1002 13:00:02.057946 4929 generic.go:334] "Generic (PLEG): container finished" podID="3a98dc16-e89b-4248-96b9-82977c83f774" containerID="e46c693039690c00c623331a534fdbe193db84623fa2e1c5bf9c099e43a7947b" exitCode=0 Oct 02 13:00:02 crc kubenswrapper[4929]: I1002 13:00:02.058104 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" event={"ID":"3a98dc16-e89b-4248-96b9-82977c83f774","Type":"ContainerDied","Data":"e46c693039690c00c623331a534fdbe193db84623fa2e1c5bf9c099e43a7947b"} Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.470315 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.672285 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc6bh\" (UniqueName: \"kubernetes.io/projected/3a98dc16-e89b-4248-96b9-82977c83f774-kube-api-access-fc6bh\") pod \"3a98dc16-e89b-4248-96b9-82977c83f774\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.672373 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a98dc16-e89b-4248-96b9-82977c83f774-config-volume\") pod \"3a98dc16-e89b-4248-96b9-82977c83f774\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.672683 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a98dc16-e89b-4248-96b9-82977c83f774-secret-volume\") pod \"3a98dc16-e89b-4248-96b9-82977c83f774\" (UID: \"3a98dc16-e89b-4248-96b9-82977c83f774\") " Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.673023 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a98dc16-e89b-4248-96b9-82977c83f774-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a98dc16-e89b-4248-96b9-82977c83f774" (UID: "3a98dc16-e89b-4248-96b9-82977c83f774"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.673658 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a98dc16-e89b-4248-96b9-82977c83f774-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.678806 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a98dc16-e89b-4248-96b9-82977c83f774-kube-api-access-fc6bh" (OuterVolumeSpecName: "kube-api-access-fc6bh") pod "3a98dc16-e89b-4248-96b9-82977c83f774" (UID: "3a98dc16-e89b-4248-96b9-82977c83f774"). InnerVolumeSpecName "kube-api-access-fc6bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.681204 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a98dc16-e89b-4248-96b9-82977c83f774-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a98dc16-e89b-4248-96b9-82977c83f774" (UID: "3a98dc16-e89b-4248-96b9-82977c83f774"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.775799 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a98dc16-e89b-4248-96b9-82977c83f774-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:03 crc kubenswrapper[4929]: I1002 13:00:03.775842 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc6bh\" (UniqueName: \"kubernetes.io/projected/3a98dc16-e89b-4248-96b9-82977c83f774-kube-api-access-fc6bh\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:04 crc kubenswrapper[4929]: I1002 13:00:04.075907 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" event={"ID":"3a98dc16-e89b-4248-96b9-82977c83f774","Type":"ContainerDied","Data":"1101397ad8e1ff160f203bbc64a40924efd0d4b503f1f31f48c93a0966787009"} Oct 02 13:00:04 crc kubenswrapper[4929]: I1002 13:00:04.075981 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1101397ad8e1ff160f203bbc64a40924efd0d4b503f1f31f48c93a0966787009" Oct 02 13:00:04 crc kubenswrapper[4929]: I1002 13:00:04.076034 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t" Oct 02 13:00:04 crc kubenswrapper[4929]: I1002 13:00:04.551312 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v"] Oct 02 13:00:04 crc kubenswrapper[4929]: I1002 13:00:04.560280 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-9pt2v"] Oct 02 13:00:06 crc kubenswrapper[4929]: I1002 13:00:06.169347 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b28d668-46df-4df8-b309-7d867c16dfe8" path="/var/lib/kubelet/pods/0b28d668-46df-4df8-b309-7d867c16dfe8/volumes" Oct 02 13:00:27 crc kubenswrapper[4929]: I1002 13:00:27.105819 4929 scope.go:117] "RemoveContainer" containerID="d5c91ada8cb08d807c5b628888ca43273aeeb9e86e739be2f7625d976861f8f0" Oct 02 13:00:27 crc kubenswrapper[4929]: I1002 13:00:27.136084 4929 scope.go:117] "RemoveContainer" containerID="7de53dfe819b9bc511537b8f984471578530b9adc639283c59fb8c4e2f09d8a7" Oct 02 13:00:33 crc kubenswrapper[4929]: I1002 13:00:33.042400 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-crswf"] Oct 02 13:00:33 crc kubenswrapper[4929]: I1002 13:00:33.053919 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-crswf"] Oct 02 13:00:34 crc kubenswrapper[4929]: I1002 13:00:34.170124 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ae9256-6d03-4694-b34f-246bff68e4f7" path="/var/lib/kubelet/pods/62ae9256-6d03-4694-b34f-246bff68e4f7/volumes" Oct 02 13:00:44 crc kubenswrapper[4929]: I1002 13:00:44.737404 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:00:44 crc kubenswrapper[4929]: I1002 13:00:44.737965 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.151687 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323501-f9cmf"] Oct 02 13:01:00 crc kubenswrapper[4929]: E1002 13:01:00.154858 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a98dc16-e89b-4248-96b9-82977c83f774" containerName="collect-profiles" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.154989 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a98dc16-e89b-4248-96b9-82977c83f774" containerName="collect-profiles" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.155854 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a98dc16-e89b-4248-96b9-82977c83f774" containerName="collect-profiles" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.157380 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.199881 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bvw6\" (UniqueName: \"kubernetes.io/projected/ec8358db-3002-4ea6-8bed-c92d946761c4-kube-api-access-4bvw6\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.200027 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-fernet-keys\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.200277 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-config-data\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.200382 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-combined-ca-bundle\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.207630 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323501-f9cmf"] Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.302359 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bvw6\" (UniqueName: \"kubernetes.io/projected/ec8358db-3002-4ea6-8bed-c92d946761c4-kube-api-access-4bvw6\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.302453 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-fernet-keys\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.302538 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-config-data\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.302569 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-combined-ca-bundle\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.309159 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-fernet-keys\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.312312 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-combined-ca-bundle\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.318292 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-config-data\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.320088 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bvw6\" (UniqueName: \"kubernetes.io/projected/ec8358db-3002-4ea6-8bed-c92d946761c4-kube-api-access-4bvw6\") pod \"keystone-cron-29323501-f9cmf\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:00 crc kubenswrapper[4929]: I1002 13:01:00.541752 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:01 crc kubenswrapper[4929]: I1002 13:01:01.057294 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323501-f9cmf"] Oct 02 13:01:01 crc kubenswrapper[4929]: I1002 13:01:01.650624 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-f9cmf" event={"ID":"ec8358db-3002-4ea6-8bed-c92d946761c4","Type":"ContainerStarted","Data":"c883b7a97b405b19e784b78f6b4826fce31a63aabcbc047dc92a69592129e5dc"} Oct 02 13:01:01 crc kubenswrapper[4929]: I1002 13:01:01.650986 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-f9cmf" event={"ID":"ec8358db-3002-4ea6-8bed-c92d946761c4","Type":"ContainerStarted","Data":"bd2f393f8083614db9e549352a3ffbee32de75898ae9363203e27dc414aa30df"} Oct 02 13:01:01 crc kubenswrapper[4929]: I1002 13:01:01.665714 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323501-f9cmf" podStartSLOduration=1.665692805 podStartE2EDuration="1.665692805s" podCreationTimestamp="2025-10-02 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:01.663933385 +0000 UTC m=+6662.214299759" watchObservedRunningTime="2025-10-02 13:01:01.665692805 +0000 UTC m=+6662.216059169" Oct 02 13:01:05 crc kubenswrapper[4929]: I1002 13:01:05.690021 4929 generic.go:334] "Generic (PLEG): container finished" podID="ec8358db-3002-4ea6-8bed-c92d946761c4" containerID="c883b7a97b405b19e784b78f6b4826fce31a63aabcbc047dc92a69592129e5dc" exitCode=0 Oct 02 13:01:05 crc kubenswrapper[4929]: I1002 13:01:05.690105 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-f9cmf" event={"ID":"ec8358db-3002-4ea6-8bed-c92d946761c4","Type":"ContainerDied","Data":"c883b7a97b405b19e784b78f6b4826fce31a63aabcbc047dc92a69592129e5dc"} Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.102843 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.159630 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-fernet-keys\") pod \"ec8358db-3002-4ea6-8bed-c92d946761c4\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.159836 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bvw6\" (UniqueName: \"kubernetes.io/projected/ec8358db-3002-4ea6-8bed-c92d946761c4-kube-api-access-4bvw6\") pod \"ec8358db-3002-4ea6-8bed-c92d946761c4\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.159934 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-combined-ca-bundle\") pod \"ec8358db-3002-4ea6-8bed-c92d946761c4\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.159975 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-config-data\") pod \"ec8358db-3002-4ea6-8bed-c92d946761c4\" (UID: \"ec8358db-3002-4ea6-8bed-c92d946761c4\") " Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.167462 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ec8358db-3002-4ea6-8bed-c92d946761c4" (UID: "ec8358db-3002-4ea6-8bed-c92d946761c4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.167498 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8358db-3002-4ea6-8bed-c92d946761c4-kube-api-access-4bvw6" (OuterVolumeSpecName: "kube-api-access-4bvw6") pod "ec8358db-3002-4ea6-8bed-c92d946761c4" (UID: "ec8358db-3002-4ea6-8bed-c92d946761c4"). InnerVolumeSpecName "kube-api-access-4bvw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.191484 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec8358db-3002-4ea6-8bed-c92d946761c4" (UID: "ec8358db-3002-4ea6-8bed-c92d946761c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.219714 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-config-data" (OuterVolumeSpecName: "config-data") pod "ec8358db-3002-4ea6-8bed-c92d946761c4" (UID: "ec8358db-3002-4ea6-8bed-c92d946761c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.263213 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.263532 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.263640 4929 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec8358db-3002-4ea6-8bed-c92d946761c4-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.263756 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bvw6\" (UniqueName: \"kubernetes.io/projected/ec8358db-3002-4ea6-8bed-c92d946761c4-kube-api-access-4bvw6\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.713708 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-f9cmf" event={"ID":"ec8358db-3002-4ea6-8bed-c92d946761c4","Type":"ContainerDied","Data":"bd2f393f8083614db9e549352a3ffbee32de75898ae9363203e27dc414aa30df"} Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.713780 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd2f393f8083614db9e549352a3ffbee32de75898ae9363203e27dc414aa30df" Oct 02 13:01:07 crc kubenswrapper[4929]: I1002 13:01:07.713895 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323501-f9cmf" Oct 02 13:01:14 crc kubenswrapper[4929]: I1002 13:01:14.736757 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:01:14 crc kubenswrapper[4929]: I1002 13:01:14.737544 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:01:27 crc kubenswrapper[4929]: I1002 13:01:27.254065 4929 scope.go:117] "RemoveContainer" containerID="1ea5b93107eee9f0fbd5f852bf0e01dfb6d184746e7f01efbb830c1793f323af" Oct 02 13:01:27 crc kubenswrapper[4929]: I1002 13:01:27.287014 4929 scope.go:117] "RemoveContainer" containerID="cf967af275e2f7144520bd582eb0ba82933ea94ea5996aced98b61dc4a2650bd" Oct 02 13:01:44 crc kubenswrapper[4929]: I1002 13:01:44.737060 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:01:44 crc kubenswrapper[4929]: I1002 13:01:44.738088 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:01:44 crc kubenswrapper[4929]: I1002 13:01:44.738172 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:01:44 crc kubenswrapper[4929]: I1002 13:01:44.739652 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:01:44 crc kubenswrapper[4929]: I1002 13:01:44.739743 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" gracePeriod=600 Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.086743 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" exitCode=0 Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.086815 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"} Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.087221 4929 scope.go:117] "RemoveContainer" containerID="71dbe36c7a6d8e09fe4ce647ebe551abca28a81abef88dabeb8a84825d9cf7fa" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.127313 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dvxht"] Oct 02 13:01:45 crc kubenswrapper[4929]: E1002 13:01:45.127931 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8358db-3002-4ea6-8bed-c92d946761c4" containerName="keystone-cron" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.127975 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8358db-3002-4ea6-8bed-c92d946761c4" containerName="keystone-cron" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.128191 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8358db-3002-4ea6-8bed-c92d946761c4" containerName="keystone-cron" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.129871 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.136410 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvxht"] Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.144004 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngjq7\" (UniqueName: \"kubernetes.io/projected/71a96850-f56f-454e-9868-11e1bbe6a98a-kube-api-access-ngjq7\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.144080 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-catalog-content\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.144107 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-utilities\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.247355 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjq7\" (UniqueName: \"kubernetes.io/projected/71a96850-f56f-454e-9868-11e1bbe6a98a-kube-api-access-ngjq7\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.247442 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-catalog-content\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.247475 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-utilities\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.247874 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-catalog-content\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.247986 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-utilities\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.272937 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ngjq7\" (UniqueName: \"kubernetes.io/projected/71a96850-f56f-454e-9868-11e1bbe6a98a-kube-api-access-ngjq7\") pod \"certified-operators-dvxht\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: E1002 13:01:45.380838 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.454896 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:45 crc kubenswrapper[4929]: I1002 13:01:45.945641 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvxht"] Oct 02 13:01:46 crc kubenswrapper[4929]: I1002 13:01:46.097292 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvxht" event={"ID":"71a96850-f56f-454e-9868-11e1bbe6a98a","Type":"ContainerStarted","Data":"e11c6eed12a5f15907d8d9fc40080f17bf7ff16f9f5d581442cb5746d6e32a9b"} Oct 02 13:01:46 crc kubenswrapper[4929]: I1002 13:01:46.100258 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:01:46 crc kubenswrapper[4929]: E1002 13:01:46.100588 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:01:47 crc kubenswrapper[4929]: I1002 13:01:47.119360 4929 generic.go:334] "Generic (PLEG): container finished" podID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerID="1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254" exitCode=0 Oct 02 13:01:47 crc kubenswrapper[4929]: I1002 13:01:47.119415 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvxht" event={"ID":"71a96850-f56f-454e-9868-11e1bbe6a98a","Type":"ContainerDied","Data":"1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254"} Oct 02 13:01:49 crc kubenswrapper[4929]: I1002 13:01:49.139376 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvxht" event={"ID":"71a96850-f56f-454e-9868-11e1bbe6a98a","Type":"ContainerStarted","Data":"17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2"} Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.516911 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4drhw"] Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.520485 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.548780 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4drhw"] Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.659184 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-utilities\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.659784 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2mmx\" (UniqueName: \"kubernetes.io/projected/8c7b69c7-8624-4e78-a832-7d0a5d200f55-kube-api-access-l2mmx\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.660338 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-catalog-content\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.763346 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-utilities\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.763407 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2mmx\" (UniqueName: \"kubernetes.io/projected/8c7b69c7-8624-4e78-a832-7d0a5d200f55-kube-api-access-l2mmx\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.763543 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-catalog-content\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.764108 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-utilities\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.764262 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-catalog-content\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.787187 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l2mmx\" (UniqueName: \"kubernetes.io/projected/8c7b69c7-8624-4e78-a832-7d0a5d200f55-kube-api-access-l2mmx\") pod \"redhat-operators-4drhw\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:50 crc kubenswrapper[4929]: I1002 13:01:50.860138 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:01:51 crc kubenswrapper[4929]: I1002 13:01:51.181620 4929 generic.go:334] "Generic (PLEG): container finished" podID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerID="17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2" exitCode=0 Oct 02 13:01:51 crc kubenswrapper[4929]: I1002 13:01:51.181698 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvxht" event={"ID":"71a96850-f56f-454e-9868-11e1bbe6a98a","Type":"ContainerDied","Data":"17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2"} Oct 02 13:01:51 crc kubenswrapper[4929]: I1002 13:01:51.366456 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4drhw"] Oct 02 13:01:51 crc kubenswrapper[4929]: W1002 13:01:51.370385 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c7b69c7_8624_4e78_a832_7d0a5d200f55.slice/crio-acd742b52ab7a99c1ae015562d26e85b03ec3006e8667361e401c7afab4c9f62 WatchSource:0}: Error finding container acd742b52ab7a99c1ae015562d26e85b03ec3006e8667361e401c7afab4c9f62: Status 404 returned error can't find the container with id acd742b52ab7a99c1ae015562d26e85b03ec3006e8667361e401c7afab4c9f62 Oct 02 13:01:52 crc kubenswrapper[4929]: I1002 13:01:52.194828 4929 generic.go:334] "Generic (PLEG): container finished" podID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerID="14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4" exitCode=0 Oct 02 13:01:52 crc kubenswrapper[4929]: I1002 13:01:52.195050 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drhw" event={"ID":"8c7b69c7-8624-4e78-a832-7d0a5d200f55","Type":"ContainerDied","Data":"14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4"} Oct 02 13:01:52 crc kubenswrapper[4929]: I1002 13:01:52.196486 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drhw" event={"ID":"8c7b69c7-8624-4e78-a832-7d0a5d200f55","Type":"ContainerStarted","Data":"acd742b52ab7a99c1ae015562d26e85b03ec3006e8667361e401c7afab4c9f62"} Oct 02 13:01:52 crc kubenswrapper[4929]: I1002 13:01:52.206389 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvxht" event={"ID":"71a96850-f56f-454e-9868-11e1bbe6a98a","Type":"ContainerStarted","Data":"3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260"} Oct 02 13:01:52 crc kubenswrapper[4929]: I1002 13:01:52.246472 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dvxht" podStartSLOduration=2.6962859359999998 podStartE2EDuration="7.246445894s" podCreationTimestamp="2025-10-02 13:01:45 +0000 UTC" firstStartedPulling="2025-10-02 13:01:47.122545715 +0000 UTC m=+6707.672912079" lastFinishedPulling="2025-10-02 13:01:51.672705673 +0000 UTC m=+6712.223072037" observedRunningTime="2025-10-02 13:01:52.242031956 +0000 UTC m=+6712.792398350" watchObservedRunningTime="2025-10-02 
13:01:52.246445894 +0000 UTC m=+6712.796812258" Oct 02 13:01:54 crc kubenswrapper[4929]: I1002 13:01:54.226776 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drhw" event={"ID":"8c7b69c7-8624-4e78-a832-7d0a5d200f55","Type":"ContainerStarted","Data":"0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda"} Oct 02 13:01:55 crc kubenswrapper[4929]: I1002 13:01:55.455097 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:55 crc kubenswrapper[4929]: I1002 13:01:55.455157 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:01:56 crc kubenswrapper[4929]: I1002 13:01:56.502275 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dvxht" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="registry-server" probeResult="failure" output=< Oct 02 13:01:56 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:01:56 crc kubenswrapper[4929]: > Oct 02 13:01:57 crc kubenswrapper[4929]: I1002 13:01:57.156486 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:01:57 crc kubenswrapper[4929]: E1002 13:01:57.156728 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:01:58 crc kubenswrapper[4929]: I1002 13:01:58.268609 4929 generic.go:334] "Generic (PLEG): container finished" podID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerID="0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda" exitCode=0 Oct 02 13:01:58 crc kubenswrapper[4929]: I1002 13:01:58.268665 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drhw" event={"ID":"8c7b69c7-8624-4e78-a832-7d0a5d200f55","Type":"ContainerDied","Data":"0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda"} Oct 02 13:01:58 crc kubenswrapper[4929]: I1002 13:01:58.271534 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:01:59 crc kubenswrapper[4929]: I1002 13:01:59.282855 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drhw" event={"ID":"8c7b69c7-8624-4e78-a832-7d0a5d200f55","Type":"ContainerStarted","Data":"3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc"} Oct 02 13:01:59 crc kubenswrapper[4929]: I1002 13:01:59.307195 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4drhw" podStartSLOduration=2.796651185 podStartE2EDuration="9.30717019s" podCreationTimestamp="2025-10-02 13:01:50 +0000 UTC" firstStartedPulling="2025-10-02 13:01:52.19750302 +0000 UTC m=+6712.747869384" lastFinishedPulling="2025-10-02 13:01:58.708022005 +0000 UTC m=+6719.258388389" observedRunningTime="2025-10-02 13:01:59.298253523 +0000 UTC m=+6719.848619887" watchObservedRunningTime="2025-10-02 13:01:59.30717019 +0000 UTC m=+6719.857536554" Oct 02 13:02:00 crc 
kubenswrapper[4929]: I1002 13:02:00.860758 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:02:00 crc kubenswrapper[4929]: I1002 13:02:00.861131 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:02:01 crc kubenswrapper[4929]: I1002 13:02:01.908070 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4drhw" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server" probeResult="failure" output=< Oct 02 13:02:01 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:02:01 crc kubenswrapper[4929]: > Oct 02 13:02:05 crc kubenswrapper[4929]: I1002 13:02:05.503418 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:02:05 crc kubenswrapper[4929]: I1002 13:02:05.571032 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:02:05 crc kubenswrapper[4929]: I1002 13:02:05.738298 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvxht"] Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.361093 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dvxht" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="registry-server" containerID="cri-o://3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260" gracePeriod=2 Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.870300 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.944704 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-utilities\") pod \"71a96850-f56f-454e-9868-11e1bbe6a98a\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.944782 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngjq7\" (UniqueName: \"kubernetes.io/projected/71a96850-f56f-454e-9868-11e1bbe6a98a-kube-api-access-ngjq7\") pod \"71a96850-f56f-454e-9868-11e1bbe6a98a\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.944856 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-catalog-content\") pod \"71a96850-f56f-454e-9868-11e1bbe6a98a\" (UID: \"71a96850-f56f-454e-9868-11e1bbe6a98a\") " Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.946402 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-utilities" (OuterVolumeSpecName: "utilities") pod "71a96850-f56f-454e-9868-11e1bbe6a98a" (UID: "71a96850-f56f-454e-9868-11e1bbe6a98a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.954195 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a96850-f56f-454e-9868-11e1bbe6a98a-kube-api-access-ngjq7" (OuterVolumeSpecName: "kube-api-access-ngjq7") pod "71a96850-f56f-454e-9868-11e1bbe6a98a" (UID: "71a96850-f56f-454e-9868-11e1bbe6a98a"). InnerVolumeSpecName "kube-api-access-ngjq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:07 crc kubenswrapper[4929]: I1002 13:02:07.990781 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71a96850-f56f-454e-9868-11e1bbe6a98a" (UID: "71a96850-f56f-454e-9868-11e1bbe6a98a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.048260 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngjq7\" (UniqueName: \"kubernetes.io/projected/71a96850-f56f-454e-9868-11e1bbe6a98a-kube-api-access-ngjq7\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.048308 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.048319 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a96850-f56f-454e-9868-11e1bbe6a98a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.372453 4929 generic.go:334] "Generic (PLEG): container finished" podID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerID="3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260" exitCode=0 Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.372493 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvxht" event={"ID":"71a96850-f56f-454e-9868-11e1bbe6a98a","Type":"ContainerDied","Data":"3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260"} Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.372518 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvxht" event={"ID":"71a96850-f56f-454e-9868-11e1bbe6a98a","Type":"ContainerDied","Data":"e11c6eed12a5f15907d8d9fc40080f17bf7ff16f9f5d581442cb5746d6e32a9b"} Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.372529 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvxht" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.372548 4929 scope.go:117] "RemoveContainer" containerID="3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.396232 4929 scope.go:117] "RemoveContainer" containerID="17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.397800 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvxht"] Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.407151 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dvxht"] Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.437344 4929 scope.go:117] "RemoveContainer" containerID="1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.496113 4929 scope.go:117] "RemoveContainer" containerID="3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260" Oct 02 13:02:08 crc kubenswrapper[4929]: E1002 13:02:08.496624 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260\": container with ID starting with 3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260 not found: ID does not exist" containerID="3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.496665 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260"} err="failed to get container status \"3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260\": rpc error: code = NotFound desc = could not find container \"3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260\": container with ID starting with 3f048fb7f2d649fd60e6781e0047084da3fb664e33f8e649323deba778106260 not found: ID does not exist" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.496693 4929 scope.go:117] "RemoveContainer" containerID="17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2" Oct 02 13:02:08 crc kubenswrapper[4929]: E1002 13:02:08.497785 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2\": container with ID starting with 17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2 not found: ID does not exist" containerID="17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.497829 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2"} err="failed to get container status \"17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2\": rpc error: code = NotFound desc = could not find container \"17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2\": container with ID starting with 17344f618cc7ec36ad6be3e7aea971af1b88b7c3686ab31cfb52ad8873c95eb2 not found: ID does not exist" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.497845 4929 scope.go:117] "RemoveContainer" 
containerID="1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254" Oct 02 13:02:08 crc kubenswrapper[4929]: E1002 13:02:08.498521 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254\": container with ID starting with 1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254 not found: ID does not exist" containerID="1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254" Oct 02 13:02:08 crc kubenswrapper[4929]: I1002 13:02:08.498572 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254"} err="failed to get container status \"1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254\": rpc error: code = NotFound desc = could not find container \"1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254\": container with ID starting with 1a182e51867fa8ac8c4d759a535206327d92d59c2c46d9d79d94f3162eeba254 not found: ID does not exist" Oct 02 13:02:10 crc kubenswrapper[4929]: I1002 13:02:10.169719 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" path="/var/lib/kubelet/pods/71a96850-f56f-454e-9868-11e1bbe6a98a/volumes" Oct 02 13:02:11 crc kubenswrapper[4929]: I1002 13:02:11.157597 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:02:11 crc kubenswrapper[4929]: E1002 13:02:11.158182 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:02:11 crc kubenswrapper[4929]: I1002 13:02:11.908275 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4drhw" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server" probeResult="failure" output=< Oct 02 13:02:11 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:02:11 crc kubenswrapper[4929]: > Oct 02 13:02:21 crc kubenswrapper[4929]: I1002 13:02:21.907263 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4drhw" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server" probeResult="failure" output=< Oct 02 13:02:21 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:02:21 crc kubenswrapper[4929]: > Oct 02 13:02:25 crc kubenswrapper[4929]: I1002 13:02:25.156339 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:02:25 crc kubenswrapper[4929]: E1002 13:02:25.157088 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:02:31 crc kubenswrapper[4929]: I1002 13:02:31.915164 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4drhw" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server" probeResult="failure" output=< Oct 02 13:02:31 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:02:31 crc kubenswrapper[4929]: > Oct 02 13:02:38 crc kubenswrapper[4929]: I1002 13:02:38.157188 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:02:38 crc kubenswrapper[4929]: E1002 13:02:38.157769 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:02:40 crc kubenswrapper[4929]: I1002 13:02:40.911360 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:02:40 crc kubenswrapper[4929]: I1002 13:02:40.961766 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:02:41 crc kubenswrapper[4929]: I1002 13:02:41.148548 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4drhw"] Oct 02 13:02:42 crc kubenswrapper[4929]: I1002 13:02:42.778494 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4drhw" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server" containerID="cri-o://3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc" gracePeriod=2 Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.561327 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.660855 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-catalog-content\") pod \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.661142 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2mmx\" (UniqueName: \"kubernetes.io/projected/8c7b69c7-8624-4e78-a832-7d0a5d200f55-kube-api-access-l2mmx\") pod \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.661452 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-utilities\") pod \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\" (UID: \"8c7b69c7-8624-4e78-a832-7d0a5d200f55\") " Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.662185 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-utilities" (OuterVolumeSpecName: "utilities") pod "8c7b69c7-8624-4e78-a832-7d0a5d200f55" (UID: "8c7b69c7-8624-4e78-a832-7d0a5d200f55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.667659 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7b69c7-8624-4e78-a832-7d0a5d200f55-kube-api-access-l2mmx" (OuterVolumeSpecName: "kube-api-access-l2mmx") pod "8c7b69c7-8624-4e78-a832-7d0a5d200f55" (UID: "8c7b69c7-8624-4e78-a832-7d0a5d200f55"). InnerVolumeSpecName "kube-api-access-l2mmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.739048 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c7b69c7-8624-4e78-a832-7d0a5d200f55" (UID: "8c7b69c7-8624-4e78-a832-7d0a5d200f55"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.763864 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.763906 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c7b69c7-8624-4e78-a832-7d0a5d200f55-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.763923 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2mmx\" (UniqueName: \"kubernetes.io/projected/8c7b69c7-8624-4e78-a832-7d0a5d200f55-kube-api-access-l2mmx\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.791009 4929 generic.go:334] "Generic (PLEG): container finished" podID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerID="3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc" exitCode=0 Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.791057 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drhw" event={"ID":"8c7b69c7-8624-4e78-a832-7d0a5d200f55","Type":"ContainerDied","Data":"3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc"} Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.791074 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4drhw" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.791085 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drhw" event={"ID":"8c7b69c7-8624-4e78-a832-7d0a5d200f55","Type":"ContainerDied","Data":"acd742b52ab7a99c1ae015562d26e85b03ec3006e8667361e401c7afab4c9f62"} Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.791109 4929 scope.go:117] "RemoveContainer" containerID="3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.828277 4929 scope.go:117] "RemoveContainer" containerID="0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.848390 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4drhw"] Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.855782 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4drhw"] Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.865857 4929 scope.go:117] "RemoveContainer" containerID="14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.912601 4929 scope.go:117] "RemoveContainer" containerID="3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc" Oct 02 13:02:43 crc kubenswrapper[4929]: E1002 13:02:43.913098 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc\": container with ID starting with 3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc not found: ID does not exist" containerID="3d26e9dda74049c6f95b0ec08c8cd6030f2c12ac3245e857b6a18dcd5a8cc7fc" Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.913209 4929 
Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.913437 4929 scope.go:117] "RemoveContainer" containerID="0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda"
Oct 02 13:02:43 crc kubenswrapper[4929]: E1002 13:02:43.913900 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda\": container with ID starting with 0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda not found: ID does not exist" containerID="0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda"
Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.913928 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda"} err="failed to get container status \"0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda\": rpc error: code = NotFound desc = could not find container \"0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda\": container with ID starting with 0b495a134f0b1cf33aa5479dd0be732361ff44ecacc438e0df4a48c308cd5bda not found: ID does not exist"
Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.913951 4929 scope.go:117] "RemoveContainer" containerID="14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4"
Oct 02 13:02:43 crc kubenswrapper[4929]: E1002 13:02:43.914284 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4\": container with ID starting with 14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4 not found: ID does not exist" containerID="14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4"
Oct 02 13:02:43 crc kubenswrapper[4929]: I1002 13:02:43.914374 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4"} err="failed to get container status \"14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4\": rpc error: code = NotFound desc = could not find container \"14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4\": container with ID starting with 14645399a4fe6eadcc7a84028b30908b5f3bdd0a802c82c2cd29cc90710773f4 not found: ID does not exist"
Oct 02 13:02:44 crc kubenswrapper[4929]: I1002 13:02:44.170182 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" path="/var/lib/kubelet/pods/8c7b69c7-8624-4e78-a832-7d0a5d200f55/volumes"
Oct 02 13:02:49 crc kubenswrapper[4929]: I1002 13:02:49.156618 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:02:49 crc kubenswrapper[4929]: E1002 13:02:49.157426 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:03:01 crc kubenswrapper[4929]: I1002 13:03:01.156730 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:03:01 crc kubenswrapper[4929]: E1002 13:03:01.157556 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:03:14 crc kubenswrapper[4929]: I1002 13:03:14.156696 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:03:14 crc kubenswrapper[4929]: E1002 13:03:14.157541 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:03:29 crc kubenswrapper[4929]: I1002 13:03:29.157224 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:03:29 crc kubenswrapper[4929]: E1002 13:03:29.158115 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.163664 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:03:40 crc kubenswrapper[4929]: E1002 13:03:40.164466 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.940825 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ffzns"]
Oct 02 13:03:40 crc kubenswrapper[4929]: E1002 13:03:40.941394 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="extract-content"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941417 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="extract-content"
Oct 02 13:03:40 crc kubenswrapper[4929]: E1002 13:03:40.941446 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="extract-utilities"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941455 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="extract-utilities"
Oct 02 13:03:40 crc kubenswrapper[4929]: E1002 13:03:40.941472 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941481 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server"
Oct 02 13:03:40 crc kubenswrapper[4929]: E1002 13:03:40.941492 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="extract-utilities"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941499 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="extract-utilities"
Oct 02 13:03:40 crc kubenswrapper[4929]: E1002 13:03:40.941527 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="registry-server"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941535 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="registry-server"
Oct 02 13:03:40 crc kubenswrapper[4929]: E1002 13:03:40.941547 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="extract-content"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941557 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="extract-content"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941797 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a96850-f56f-454e-9868-11e1bbe6a98a" containerName="registry-server"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.941825 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7b69c7-8624-4e78-a832-7d0a5d200f55" containerName="registry-server"
Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.948757 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ffzns"
Need to start a new one" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.976495 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-catalog-content\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.976703 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-utilities\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:40 crc kubenswrapper[4929]: I1002 13:03:40.976941 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hf97\" (UniqueName: \"kubernetes.io/projected/99daee6d-1098-42fc-8194-ff7a5f5cef73-kube-api-access-7hf97\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.026583 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffzns"] Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.080058 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hf97\" (UniqueName: \"kubernetes.io/projected/99daee6d-1098-42fc-8194-ff7a5f5cef73-kube-api-access-7hf97\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.080170 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-catalog-content\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.080687 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-catalog-content\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.080850 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-utilities\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.081216 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-utilities\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.117488 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7hf97\" (UniqueName: \"kubernetes.io/projected/99daee6d-1098-42fc-8194-ff7a5f5cef73-kube-api-access-7hf97\") pod \"community-operators-ffzns\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.273463 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:41 crc kubenswrapper[4929]: I1002 13:03:41.788652 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffzns"] Oct 02 13:03:42 crc kubenswrapper[4929]: I1002 13:03:42.336085 4929 generic.go:334] "Generic (PLEG): container finished" podID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerID="5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489" exitCode=0 Oct 02 13:03:42 crc kubenswrapper[4929]: I1002 13:03:42.336166 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffzns" event={"ID":"99daee6d-1098-42fc-8194-ff7a5f5cef73","Type":"ContainerDied","Data":"5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489"} Oct 02 13:03:42 crc kubenswrapper[4929]: I1002 13:03:42.336389 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffzns" event={"ID":"99daee6d-1098-42fc-8194-ff7a5f5cef73","Type":"ContainerStarted","Data":"97dc5f02dedf9da28e6c16bc221fcc3a92873f145a9dac403a21283839a26de4"} Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.135063 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t6rks"] Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.139899 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.144754 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-catalog-content\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.144814 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stn6f\" (UniqueName: \"kubernetes.io/projected/0260a758-53a0-4da5-98e1-5520793f3afe-kube-api-access-stn6f\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.144914 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-utilities\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.153270 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6rks"] Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.247915 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-catalog-content\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.248001 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stn6f\" (UniqueName: \"kubernetes.io/projected/0260a758-53a0-4da5-98e1-5520793f3afe-kube-api-access-stn6f\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.248155 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-utilities\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.249150 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-utilities\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.249183 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-catalog-content\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.269030 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-stn6f\" (UniqueName: \"kubernetes.io/projected/0260a758-53a0-4da5-98e1-5520793f3afe-kube-api-access-stn6f\") pod \"redhat-marketplace-t6rks\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.358182 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffzns" event={"ID":"99daee6d-1098-42fc-8194-ff7a5f5cef73","Type":"ContainerStarted","Data":"5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae"} Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.472723 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:44 crc kubenswrapper[4929]: I1002 13:03:44.949837 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6rks"] Oct 02 13:03:45 crc kubenswrapper[4929]: I1002 13:03:45.370212 4929 generic.go:334] "Generic (PLEG): container finished" podID="0260a758-53a0-4da5-98e1-5520793f3afe" containerID="3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6" exitCode=0 Oct 02 13:03:45 crc kubenswrapper[4929]: I1002 13:03:45.370689 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6rks" event={"ID":"0260a758-53a0-4da5-98e1-5520793f3afe","Type":"ContainerDied","Data":"3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6"} Oct 02 13:03:45 crc kubenswrapper[4929]: I1002 13:03:45.370785 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6rks" event={"ID":"0260a758-53a0-4da5-98e1-5520793f3afe","Type":"ContainerStarted","Data":"776f1cca63d9c48a6cb58e9162662d3ca88c3138041be642eddae8372a03930f"} Oct 02 13:03:45 crc kubenswrapper[4929]: I1002 13:03:45.373809 4929 generic.go:334] "Generic (PLEG): container finished" podID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerID="5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae" exitCode=0 Oct 02 13:03:45 crc kubenswrapper[4929]: I1002 13:03:45.373863 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffzns" event={"ID":"99daee6d-1098-42fc-8194-ff7a5f5cef73","Type":"ContainerDied","Data":"5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae"} Oct 02 13:03:46 crc kubenswrapper[4929]: I1002 13:03:46.395820 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffzns" event={"ID":"99daee6d-1098-42fc-8194-ff7a5f5cef73","Type":"ContainerStarted","Data":"30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22"} Oct 02 13:03:46 crc kubenswrapper[4929]: I1002 13:03:46.418075 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ffzns" podStartSLOduration=2.606147174 podStartE2EDuration="6.418048359s" podCreationTimestamp="2025-10-02 13:03:40 +0000 UTC" firstStartedPulling="2025-10-02 13:03:42.339003589 +0000 UTC m=+6822.889369943" lastFinishedPulling="2025-10-02 13:03:46.150904764 +0000 UTC m=+6826.701271128" observedRunningTime="2025-10-02 13:03:46.414578469 +0000 UTC m=+6826.964944833" watchObservedRunningTime="2025-10-02 13:03:46.418048359 +0000 UTC m=+6826.968414723" Oct 02 13:03:47 crc kubenswrapper[4929]: I1002 13:03:47.407440 4929 generic.go:334] "Generic (PLEG): container finished" 
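The pod_startup_latency_tracker entry above records two durations: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the time spent pulling images. The monotonic (m=+...) offsets in the entry make this checkable: 6826.701271128 - 6822.889369943 = 3.811901185s of image pull, and 6.418048359s - 3.811901185s = 2.606147174s, exactly the logged SLO figure. The same arithmetic in runnable form (constants copied from the entry above):

    package main

    import "fmt"

    func main() {
        // Monotonic clock readings (the m=+... offsets) from the tracker entry.
        const (
            firstStartedPulling = 6822.889369943 // image pull began
            lastFinishedPulling = 6826.701271128 // image pull ended
            podStartE2E         = 6.418048359    // watchObservedRunningTime - podCreationTimestamp
        )
        pull := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image pull took     %.9fs\n", pull)             // ~3.811901185s
        fmt.Printf("podStartSLOduration %.9fs\n", podStartE2E-pull) // ~2.606147174s, as logged
    }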
podID="0260a758-53a0-4da5-98e1-5520793f3afe" containerID="fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863" exitCode=0 Oct 02 13:03:47 crc kubenswrapper[4929]: I1002 13:03:47.407533 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6rks" event={"ID":"0260a758-53a0-4da5-98e1-5520793f3afe","Type":"ContainerDied","Data":"fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863"} Oct 02 13:03:49 crc kubenswrapper[4929]: I1002 13:03:49.427022 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6rks" event={"ID":"0260a758-53a0-4da5-98e1-5520793f3afe","Type":"ContainerStarted","Data":"c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9"} Oct 02 13:03:49 crc kubenswrapper[4929]: I1002 13:03:49.456729 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t6rks" podStartSLOduration=2.411766408 podStartE2EDuration="5.456705221s" podCreationTimestamp="2025-10-02 13:03:44 +0000 UTC" firstStartedPulling="2025-10-02 13:03:45.372469561 +0000 UTC m=+6825.922835925" lastFinishedPulling="2025-10-02 13:03:48.417408374 +0000 UTC m=+6828.967774738" observedRunningTime="2025-10-02 13:03:49.449129312 +0000 UTC m=+6829.999495676" watchObservedRunningTime="2025-10-02 13:03:49.456705221 +0000 UTC m=+6830.007071585" Oct 02 13:03:51 crc kubenswrapper[4929]: I1002 13:03:51.274124 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:51 crc kubenswrapper[4929]: I1002 13:03:51.275303 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:51 crc kubenswrapper[4929]: I1002 13:03:51.327318 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:51 crc kubenswrapper[4929]: I1002 13:03:51.491151 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:52 crc kubenswrapper[4929]: I1002 13:03:52.157738 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:03:52 crc kubenswrapper[4929]: E1002 13:03:52.158207 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:03:52 crc kubenswrapper[4929]: I1002 13:03:52.530458 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ffzns"] Oct 02 13:03:53 crc kubenswrapper[4929]: I1002 13:03:53.463493 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ffzns" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="registry-server" containerID="cri-o://30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22" gracePeriod=2 Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.046779 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.197932 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hf97\" (UniqueName: \"kubernetes.io/projected/99daee6d-1098-42fc-8194-ff7a5f5cef73-kube-api-access-7hf97\") pod \"99daee6d-1098-42fc-8194-ff7a5f5cef73\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.198067 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-catalog-content\") pod \"99daee6d-1098-42fc-8194-ff7a5f5cef73\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.198128 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-utilities\") pod \"99daee6d-1098-42fc-8194-ff7a5f5cef73\" (UID: \"99daee6d-1098-42fc-8194-ff7a5f5cef73\") " Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.199305 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-utilities" (OuterVolumeSpecName: "utilities") pod "99daee6d-1098-42fc-8194-ff7a5f5cef73" (UID: "99daee6d-1098-42fc-8194-ff7a5f5cef73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.205128 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99daee6d-1098-42fc-8194-ff7a5f5cef73-kube-api-access-7hf97" (OuterVolumeSpecName: "kube-api-access-7hf97") pod "99daee6d-1098-42fc-8194-ff7a5f5cef73" (UID: "99daee6d-1098-42fc-8194-ff7a5f5cef73"). InnerVolumeSpecName "kube-api-access-7hf97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.244127 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99daee6d-1098-42fc-8194-ff7a5f5cef73" (UID: "99daee6d-1098-42fc-8194-ff7a5f5cef73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.300415 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.300485 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99daee6d-1098-42fc-8194-ff7a5f5cef73-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.300499 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hf97\" (UniqueName: \"kubernetes.io/projected/99daee6d-1098-42fc-8194-ff7a5f5cef73-kube-api-access-7hf97\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.473078 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.473134 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.478390 4929 generic.go:334] "Generic (PLEG): container finished" podID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerID="30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22" exitCode=0 Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.478441 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffzns" event={"ID":"99daee6d-1098-42fc-8194-ff7a5f5cef73","Type":"ContainerDied","Data":"30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22"} Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.478491 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffzns" event={"ID":"99daee6d-1098-42fc-8194-ff7a5f5cef73","Type":"ContainerDied","Data":"97dc5f02dedf9da28e6c16bc221fcc3a92873f145a9dac403a21283839a26de4"} Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.478516 4929 scope.go:117] "RemoveContainer" containerID="30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.478770 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffzns" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.511527 4929 scope.go:117] "RemoveContainer" containerID="5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.516873 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ffzns"] Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.526600 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ffzns"] Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.540332 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.543697 4929 scope.go:117] "RemoveContainer" containerID="5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.593023 4929 scope.go:117] "RemoveContainer" containerID="30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22" Oct 02 13:03:54 crc kubenswrapper[4929]: E1002 13:03:54.593595 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22\": container with ID starting with 30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22 not found: ID does not exist" containerID="30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.593651 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22"} err="failed to get container status \"30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22\": rpc error: code = NotFound desc = could not find container \"30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22\": container with ID starting with 30d963ff8b763dcd6d952ffc28bba0848592e34604a0a29333715051a2e0eb22 not found: ID does not exist" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.593688 4929 scope.go:117] "RemoveContainer" containerID="5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae" Oct 02 13:03:54 crc kubenswrapper[4929]: E1002 13:03:54.594395 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae\": container with ID starting with 5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae not found: ID does not exist" containerID="5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.594440 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae"} err="failed to get container status \"5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae\": rpc error: code = NotFound desc = could not find container \"5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae\": container with ID starting with 5eee39a34840c8180680f639a78f38416aa4194e7bd56199eacea8fffdde1aae not found: ID does not exist" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.594470 4929 scope.go:117] "RemoveContainer" 
containerID="5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489" Oct 02 13:03:54 crc kubenswrapper[4929]: E1002 13:03:54.594880 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489\": container with ID starting with 5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489 not found: ID does not exist" containerID="5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489" Oct 02 13:03:54 crc kubenswrapper[4929]: I1002 13:03:54.594939 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489"} err="failed to get container status \"5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489\": rpc error: code = NotFound desc = could not find container \"5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489\": container with ID starting with 5e66a49f2dbfac880030f70cdd15bcc7043f7be6d4bf18550ec3764bbc829489 not found: ID does not exist" Oct 02 13:03:55 crc kubenswrapper[4929]: I1002 13:03:55.546137 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:56 crc kubenswrapper[4929]: I1002 13:03:56.171526 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" path="/var/lib/kubelet/pods/99daee6d-1098-42fc-8194-ff7a5f5cef73/volumes" Oct 02 13:03:56 crc kubenswrapper[4929]: I1002 13:03:56.927553 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6rks"] Oct 02 13:03:57 crc kubenswrapper[4929]: I1002 13:03:57.504253 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t6rks" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="registry-server" containerID="cri-o://c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9" gracePeriod=2 Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.004657 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.115480 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-utilities\") pod \"0260a758-53a0-4da5-98e1-5520793f3afe\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.115621 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-catalog-content\") pod \"0260a758-53a0-4da5-98e1-5520793f3afe\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.115690 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stn6f\" (UniqueName: \"kubernetes.io/projected/0260a758-53a0-4da5-98e1-5520793f3afe-kube-api-access-stn6f\") pod \"0260a758-53a0-4da5-98e1-5520793f3afe\" (UID: \"0260a758-53a0-4da5-98e1-5520793f3afe\") " Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.116345 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-utilities" (OuterVolumeSpecName: "utilities") pod "0260a758-53a0-4da5-98e1-5520793f3afe" (UID: "0260a758-53a0-4da5-98e1-5520793f3afe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.124665 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0260a758-53a0-4da5-98e1-5520793f3afe-kube-api-access-stn6f" (OuterVolumeSpecName: "kube-api-access-stn6f") pod "0260a758-53a0-4da5-98e1-5520793f3afe" (UID: "0260a758-53a0-4da5-98e1-5520793f3afe"). InnerVolumeSpecName "kube-api-access-stn6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.131164 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0260a758-53a0-4da5-98e1-5520793f3afe" (UID: "0260a758-53a0-4da5-98e1-5520793f3afe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.219046 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.219203 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0260a758-53a0-4da5-98e1-5520793f3afe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.219221 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stn6f\" (UniqueName: \"kubernetes.io/projected/0260a758-53a0-4da5-98e1-5520793f3afe-kube-api-access-stn6f\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.516835 4929 generic.go:334] "Generic (PLEG): container finished" podID="0260a758-53a0-4da5-98e1-5520793f3afe" containerID="c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9" exitCode=0 Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.516877 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6rks" event={"ID":"0260a758-53a0-4da5-98e1-5520793f3afe","Type":"ContainerDied","Data":"c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9"} Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.516903 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6rks" event={"ID":"0260a758-53a0-4da5-98e1-5520793f3afe","Type":"ContainerDied","Data":"776f1cca63d9c48a6cb58e9162662d3ca88c3138041be642eddae8372a03930f"} Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.516919 4929 scope.go:117] "RemoveContainer" containerID="c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.517057 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6rks" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.539406 4929 scope.go:117] "RemoveContainer" containerID="fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.543742 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6rks"] Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.554861 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6rks"] Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.566567 4929 scope.go:117] "RemoveContainer" containerID="3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.614074 4929 scope.go:117] "RemoveContainer" containerID="c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9" Oct 02 13:03:58 crc kubenswrapper[4929]: E1002 13:03:58.619507 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9\": container with ID starting with c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9 not found: ID does not exist" containerID="c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.619583 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9"} err="failed to get container status \"c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9\": rpc error: code = NotFound desc = could not find container \"c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9\": container with ID starting with c2a81eb1e476971223dfe3d945567b4e678a962597b52bf7853bb2ea934505f9 not found: ID does not exist" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.619615 4929 scope.go:117] "RemoveContainer" containerID="fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863" Oct 02 13:03:58 crc kubenswrapper[4929]: E1002 13:03:58.620517 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863\": container with ID starting with fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863 not found: ID does not exist" containerID="fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.620543 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863"} err="failed to get container status \"fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863\": rpc error: code = NotFound desc = could not find container \"fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863\": container with ID starting with fa23b0fb9f83d14b26421a0ac377d10c59760ce6b4b439030acb030be5498863 not found: ID does not exist" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.620558 4929 scope.go:117] "RemoveContainer" containerID="3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6" Oct 02 13:03:58 crc kubenswrapper[4929]: E1002 13:03:58.621219 4929 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6\": container with ID starting with 3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6 not found: ID does not exist" containerID="3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6" Oct 02 13:03:58 crc kubenswrapper[4929]: I1002 13:03:58.621290 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6"} err="failed to get container status \"3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6\": rpc error: code = NotFound desc = could not find container \"3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6\": container with ID starting with 3b50d78e70a887c9edbb21a34f86a2d8ff5742a777fb04d106e41cb20bdf28b6 not found: ID does not exist" Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.033555 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-ndzrx"] Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.041764 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-ndzrx"] Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.174767 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" path="/var/lib/kubelet/pods/0260a758-53a0-4da5-98e1-5520793f3afe/volumes" Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.175975 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a" path="/var/lib/kubelet/pods/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a/volumes" Oct 02 13:04:04 crc kubenswrapper[4929]: I1002 13:04:04.157225 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:04:04 crc kubenswrapper[4929]: E1002 13:04:04.159184 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:04:10 crc kubenswrapper[4929]: I1002 13:04:10.030666 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-5fca-account-create-2gv4t"] Oct 02 13:04:10 crc kubenswrapper[4929]: I1002 13:04:10.041043 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-5fca-account-create-2gv4t"] Oct 02 13:04:10 crc kubenswrapper[4929]: I1002 13:04:10.169138 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd379653-5579-4af4-b401-756c91cd6f66" path="/var/lib/kubelet/pods/fd379653-5579-4af4-b401-756c91cd6f66/volumes" Oct 02 13:04:16 crc kubenswrapper[4929]: I1002 13:04:16.158815 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:04:16 crc kubenswrapper[4929]: E1002 13:04:16.160352 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.033555 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-ndzrx"]
Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.041764 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-ndzrx"]
Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.174767 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" path="/var/lib/kubelet/pods/0260a758-53a0-4da5-98e1-5520793f3afe/volumes"
Oct 02 13:04:00 crc kubenswrapper[4929]: I1002 13:04:00.175975 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a" path="/var/lib/kubelet/pods/6b6944d9-7fd8-4c8b-8f12-1a7a9b79fb2a/volumes"
Oct 02 13:04:04 crc kubenswrapper[4929]: I1002 13:04:04.157225 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:04:04 crc kubenswrapper[4929]: E1002 13:04:04.159184 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:04:10 crc kubenswrapper[4929]: I1002 13:04:10.030666 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-5fca-account-create-2gv4t"]
Oct 02 13:04:10 crc kubenswrapper[4929]: I1002 13:04:10.041043 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-5fca-account-create-2gv4t"]
Oct 02 13:04:10 crc kubenswrapper[4929]: I1002 13:04:10.169138 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd379653-5579-4af4-b401-756c91cd6f66" path="/var/lib/kubelet/pods/fd379653-5579-4af4-b401-756c91cd6f66/volumes"
Oct 02 13:04:16 crc kubenswrapper[4929]: I1002 13:04:16.158815 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:04:16 crc kubenswrapper[4929]: E1002 13:04:16.160352 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:04:24 crc kubenswrapper[4929]: I1002 13:04:24.039093 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-wdlvs"]
Oct 02 13:04:24 crc kubenswrapper[4929]: I1002 13:04:24.049678 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-wdlvs"]
Oct 02 13:04:24 crc kubenswrapper[4929]: I1002 13:04:24.172564 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2729fe-14bc-44f9-bedb-e9745024b9d9" path="/var/lib/kubelet/pods/2f2729fe-14bc-44f9-bedb-e9745024b9d9/volumes"
Oct 02 13:04:27 crc kubenswrapper[4929]: I1002 13:04:27.443201 4929 scope.go:117] "RemoveContainer" containerID="a89b35fae137a9efc2f21e631e6e6762983d9e3df8442f825611d841f7745c6b"
Oct 02 13:04:27 crc kubenswrapper[4929]: I1002 13:04:27.473375 4929 scope.go:117] "RemoveContainer" containerID="d5124b2d6d0bef2eac6f4bf796108c5a0564dd4e56ba73c65ba1740ea9cc8157"
Oct 02 13:04:27 crc kubenswrapper[4929]: I1002 13:04:27.522603 4929 scope.go:117] "RemoveContainer" containerID="8f116a5465a78365b06baefe321fbfd3c163be569eab3595a3630362510dd6c8"
Oct 02 13:04:31 crc kubenswrapper[4929]: I1002 13:04:31.157098 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:04:31 crc kubenswrapper[4929]: E1002 13:04:31.158179 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:04:42 crc kubenswrapper[4929]: I1002 13:04:42.157306 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:04:42 crc kubenswrapper[4929]: E1002 13:04:42.158069 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:04:56 crc kubenswrapper[4929]: I1002 13:04:56.156677 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:04:56 crc kubenswrapper[4929]: E1002 13:04:56.157487 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:05:07 crc kubenswrapper[4929]: I1002 13:05:07.157685 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:05:07 crc kubenswrapper[4929]: E1002 13:05:07.158480 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:05:18 crc kubenswrapper[4929]: I1002 13:05:18.157409 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:05:18 crc kubenswrapper[4929]: E1002 13:05:18.158211 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:05:27 crc kubenswrapper[4929]: I1002 13:05:27.681508 4929 scope.go:117] "RemoveContainer" containerID="d63a3e91bf7554c418d3e8100f29186d4a14bd6971482b350b4be0ceeae99915"
Oct 02 13:05:27 crc kubenswrapper[4929]: I1002 13:05:27.707228 4929 scope.go:117] "RemoveContainer" containerID="53ac1e449cea77d54708f0eb9eed0a8e68188e5152a7018108a627b58c4c5b2d"
Oct 02 13:05:30 crc kubenswrapper[4929]: I1002 13:05:30.164036 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:05:30 crc kubenswrapper[4929]: E1002 13:05:30.164938 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:05:42 crc kubenswrapper[4929]: I1002 13:05:42.157556 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:05:42 crc kubenswrapper[4929]: E1002 13:05:42.158422 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:05:54 crc kubenswrapper[4929]: I1002 13:05:54.157381 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:05:54 crc kubenswrapper[4929]: E1002 13:05:54.158300 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:06:06 crc kubenswrapper[4929]: I1002 13:06:06.157027 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:06:06 crc kubenswrapper[4929]: E1002 13:06:06.157838 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:06:21 crc kubenswrapper[4929]: I1002 13:06:21.156699 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:06:21 crc kubenswrapper[4929]: E1002 13:06:21.157403 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:06:24 crc kubenswrapper[4929]: I1002 13:06:24.050398 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-4brnk"]
Oct 02 13:06:24 crc kubenswrapper[4929]: I1002 13:06:24.064201 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-4brnk"]
Oct 02 13:06:24 crc kubenswrapper[4929]: I1002 13:06:24.167989 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39c7e50-59e1-4392-a742-3c11054e25d8" path="/var/lib/kubelet/pods/c39c7e50-59e1-4392-a742-3c11054e25d8/volumes"
Oct 02 13:06:27 crc kubenswrapper[4929]: I1002 13:06:27.781471 4929 scope.go:117] "RemoveContainer" containerID="0bede43362ffd71d0457181b201c19d8c79fba4f503067bfb97cc75be4c5fbce"
Oct 02 13:06:34 crc kubenswrapper[4929]: I1002 13:06:34.037975 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-7a7b-account-create-r4q7k"]
Oct 02 13:06:34 crc kubenswrapper[4929]: I1002 13:06:34.047203 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-7a7b-account-create-r4q7k"]
Oct 02 13:06:34 crc kubenswrapper[4929]: I1002 13:06:34.169848 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb557e3e-048d-4b3e-bcf9-aa827b11e02d" path="/var/lib/kubelet/pods/bb557e3e-048d-4b3e-bcf9-aa827b11e02d/volumes"
Oct 02 13:06:36 crc kubenswrapper[4929]: I1002 13:06:36.157310 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9"
Oct 02 13:06:36 crc kubenswrapper[4929]: E1002 13:06:36.157841 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
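
The steadily repeating "back-off 5m0s" entries are the kubelet's capped exponential restart backoff for a crash-looping container: the delay roughly doubles per restart until it saturates at a five-minute ceiling, after which every sync prints the same message until the container stays up. The 10s base and 5m cap below are the upstream defaults (an assumption, not read from this cluster's config), and the sketch is illustrative rather than the kubelet's actual code:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay doubles a base delay once per observed restart and
// caps it at max, reproducing the shape of CrashLoopBackOff timing.
func crashLoopDelay(restarts int, base, max time.Duration) time.Duration {
	d := base
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %s\n", r, crashLoopDelay(r, 10*time.Second, 5*time.Minute))
	}
	// From restart 5 onward the wait is pinned at 5m0s, which is why the
	// log keeps emitting "back-off 5m0s restarting failed container".
}
```
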
source="api" pods=["openstack/aodh-db-sync-v26ms"] Oct 02 13:06:46 crc kubenswrapper[4929]: I1002 13:06:46.178831 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e0d329-c0ec-4e95-a696-ae33e2eda9dc" path="/var/lib/kubelet/pods/95e0d329-c0ec-4e95-a696-ae33e2eda9dc/volumes" Oct 02 13:06:48 crc kubenswrapper[4929]: I1002 13:06:48.156644 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:06:49 crc kubenswrapper[4929]: I1002 13:06:49.264005 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"99a1b9ceeefb5b10e03aadaa9dabdde98633cb99480c61cafc5d20d91d57af25"} Oct 02 13:07:09 crc kubenswrapper[4929]: I1002 13:07:09.039984 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-frhnh"] Oct 02 13:07:09 crc kubenswrapper[4929]: I1002 13:07:09.048633 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-frhnh"] Oct 02 13:07:10 crc kubenswrapper[4929]: I1002 13:07:10.169114 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd09510-c570-4fdc-afe4-206212f65c8e" path="/var/lib/kubelet/pods/3fd09510-c570-4fdc-afe4-206212f65c8e/volumes" Oct 02 13:07:19 crc kubenswrapper[4929]: I1002 13:07:19.026425 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-5b16-account-create-c2fzx"] Oct 02 13:07:19 crc kubenswrapper[4929]: I1002 13:07:19.036686 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-5b16-account-create-c2fzx"] Oct 02 13:07:20 crc kubenswrapper[4929]: I1002 13:07:20.167846 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565fce37-3c88-4527-aa53-688c6e3ab5c9" path="/var/lib/kubelet/pods/565fce37-3c88-4527-aa53-688c6e3ab5c9/volumes" Oct 02 13:07:27 crc kubenswrapper[4929]: I1002 13:07:27.845717 4929 scope.go:117] "RemoveContainer" containerID="f55eed708757f98f617dff476d4efaf1488663c370ff34c4cfe7d999e425c065" Oct 02 13:07:27 crc kubenswrapper[4929]: I1002 13:07:27.871685 4929 scope.go:117] "RemoveContainer" containerID="f81535a71e3507564f7bdf50047ede6bd67286802813789207cff50b9155b61c" Oct 02 13:07:27 crc kubenswrapper[4929]: I1002 13:07:27.934886 4929 scope.go:117] "RemoveContainer" containerID="a389bdc5843f4c36061cae2c2fa34c7feaae23cb830cc360f5465e2d6a4fa66f" Oct 02 13:07:27 crc kubenswrapper[4929]: I1002 13:07:27.988838 4929 scope.go:117] "RemoveContainer" containerID="dd215e6d27b257cea282c8b65069f1a8e81d19255c6137fe2f20a101c7929878" Oct 02 13:07:30 crc kubenswrapper[4929]: I1002 13:07:30.040786 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-q5nx8"] Oct 02 13:07:30 crc kubenswrapper[4929]: I1002 13:07:30.051806 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-q5nx8"] Oct 02 13:07:30 crc kubenswrapper[4929]: I1002 13:07:30.170265 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e165862b-1cb5-4cfb-934f-33abd112b63f" path="/var/lib/kubelet/pods/e165862b-1cb5-4cfb-934f-33abd112b63f/volumes" Oct 02 13:08:28 crc kubenswrapper[4929]: I1002 13:08:28.099632 4929 scope.go:117] "RemoveContainer" containerID="8492690d54bd882581ed4c6871094cfa65f767d4c84a2e2197a983dff379cafe" Oct 02 13:09:14 crc kubenswrapper[4929]: I1002 13:09:14.736697 4929 patch_prober.go:28] interesting 
pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:09:14 crc kubenswrapper[4929]: I1002 13:09:14.737231 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:09:44 crc kubenswrapper[4929]: I1002 13:09:44.740836 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:09:44 crc kubenswrapper[4929]: I1002 13:09:44.741878 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:10:14 crc kubenswrapper[4929]: I1002 13:10:14.736457 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:10:14 crc kubenswrapper[4929]: I1002 13:10:14.737067 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:10:14 crc kubenswrapper[4929]: I1002 13:10:14.737126 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:10:14 crc kubenswrapper[4929]: I1002 13:10:14.737946 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99a1b9ceeefb5b10e03aadaa9dabdde98633cb99480c61cafc5d20d91d57af25"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:10:14 crc kubenswrapper[4929]: I1002 13:10:14.738044 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://99a1b9ceeefb5b10e03aadaa9dabdde98633cb99480c61cafc5d20d91d57af25" gracePeriod=600 Oct 02 13:10:15 crc kubenswrapper[4929]: I1002 13:10:15.317207 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="99a1b9ceeefb5b10e03aadaa9dabdde98633cb99480c61cafc5d20d91d57af25" exitCode=0 Oct 02 13:10:15 crc kubenswrapper[4929]: I1002 13:10:15.317312 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
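
Each Liveness failure above is simply an HTTP GET against the daemon's health endpoint being refused; once the failure threshold is crossed, the kubelet marks the container unhealthy and kills it with the configured grace period (600s here), as the next entries show. A self-contained sketch of the same kind of check, reusing the address from the log; the 1s timeout and the >=400 failure rule are simplifying assumptions, not this pod's exact probe spec:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs an HTTP liveness-style check: any transport error
// (e.g. "connect: connection refused") or an HTTP error status counts
// as a probe failure.
func probeOnce(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce("http://127.0.0.1:8798/health", time.Second); err != nil {
		fmt.Println("Probe failed:", err)
	} else {
		fmt.Println("healthy")
	}
}
```
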
pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"99a1b9ceeefb5b10e03aadaa9dabdde98633cb99480c61cafc5d20d91d57af25"} Oct 02 13:10:15 crc kubenswrapper[4929]: I1002 13:10:15.317591 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2"} Oct 02 13:10:15 crc kubenswrapper[4929]: I1002 13:10:15.317619 4929 scope.go:117] "RemoveContainer" containerID="f73155b4eb29e80b068f07d2d72745f50b35df0c2c0c7c17f5ae3a4a26122ae9" Oct 02 13:10:22 crc kubenswrapper[4929]: I1002 13:10:22.382377 4929 generic.go:334] "Generic (PLEG): container finished" podID="4bff17f1-9395-47c4-a706-a0adf7745c50" containerID="bb0bbbbcfd0261fb8a50b2809de5daf571ac3064e72a02e58c7fe7f69dd72ff7" exitCode=0 Oct 02 13:10:22 crc kubenswrapper[4929]: I1002 13:10:22.382478 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" event={"ID":"4bff17f1-9395-47c4-a706-a0adf7745c50","Type":"ContainerDied","Data":"bb0bbbbcfd0261fb8a50b2809de5daf571ac3064e72a02e58c7fe7f69dd72ff7"} Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.860466 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.868894 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-inventory\") pod \"4bff17f1-9395-47c4-a706-a0adf7745c50\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.868981 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ceph\") pod \"4bff17f1-9395-47c4-a706-a0adf7745c50\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.869490 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k644\" (UniqueName: \"kubernetes.io/projected/4bff17f1-9395-47c4-a706-a0adf7745c50-kube-api-access-8k644\") pod \"4bff17f1-9395-47c4-a706-a0adf7745c50\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.869622 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-tripleo-cleanup-combined-ca-bundle\") pod \"4bff17f1-9395-47c4-a706-a0adf7745c50\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.869835 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ssh-key\") pod \"4bff17f1-9395-47c4-a706-a0adf7745c50\" (UID: \"4bff17f1-9395-47c4-a706-a0adf7745c50\") " Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.880317 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bff17f1-9395-47c4-a706-a0adf7745c50-kube-api-access-8k644" 
(OuterVolumeSpecName: "kube-api-access-8k644") pod "4bff17f1-9395-47c4-a706-a0adf7745c50" (UID: "4bff17f1-9395-47c4-a706-a0adf7745c50"). InnerVolumeSpecName "kube-api-access-8k644". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.881284 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "4bff17f1-9395-47c4-a706-a0adf7745c50" (UID: "4bff17f1-9395-47c4-a706-a0adf7745c50"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.888183 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ceph" (OuterVolumeSpecName: "ceph") pod "4bff17f1-9395-47c4-a706-a0adf7745c50" (UID: "4bff17f1-9395-47c4-a706-a0adf7745c50"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.912685 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4bff17f1-9395-47c4-a706-a0adf7745c50" (UID: "4bff17f1-9395-47c4-a706-a0adf7745c50"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.926148 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-inventory" (OuterVolumeSpecName: "inventory") pod "4bff17f1-9395-47c4-a706-a0adf7745c50" (UID: "4bff17f1-9395-47c4-a706-a0adf7745c50"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.972461 4929 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.972512 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.972550 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.972563 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4bff17f1-9395-47c4-a706-a0adf7745c50-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:23 crc kubenswrapper[4929]: I1002 13:10:23.972578 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k644\" (UniqueName: \"kubernetes.io/projected/4bff17f1-9395-47c4-a706-a0adf7745c50-kube-api-access-8k644\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:24 crc kubenswrapper[4929]: I1002 13:10:24.406219 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" event={"ID":"4bff17f1-9395-47c4-a706-a0adf7745c50","Type":"ContainerDied","Data":"806664fab10e9d4d8e471a1b215c061de5d1c7f6c206f5f7ca6f62a8a08bab3f"} Oct 02 13:10:24 crc kubenswrapper[4929]: I1002 13:10:24.406549 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806664fab10e9d4d8e471a1b215c061de5d1c7f6c206f5f7ca6f62a8a08bab3f" Oct 02 13:10:24 crc kubenswrapper[4929]: I1002 13:10:24.406283 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.301278 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6pmtz"] Oct 02 13:10:31 crc kubenswrapper[4929]: E1002 13:10:31.302132 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="registry-server" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302146 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="registry-server" Oct 02 13:10:31 crc kubenswrapper[4929]: E1002 13:10:31.302173 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="extract-utilities" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302180 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="extract-utilities" Oct 02 13:10:31 crc kubenswrapper[4929]: E1002 13:10:31.302193 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="extract-content" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302200 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="extract-content" Oct 02 13:10:31 crc kubenswrapper[4929]: E1002 13:10:31.302222 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="extract-content" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302227 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="extract-content" Oct 02 13:10:31 crc kubenswrapper[4929]: E1002 13:10:31.302248 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bff17f1-9395-47c4-a706-a0adf7745c50" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302256 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bff17f1-9395-47c4-a706-a0adf7745c50" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 02 13:10:31 crc kubenswrapper[4929]: E1002 13:10:31.302269 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="registry-server" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302275 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="registry-server" Oct 02 13:10:31 crc kubenswrapper[4929]: E1002 13:10:31.302286 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="extract-utilities" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302292 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="extract-utilities" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302485 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="99daee6d-1098-42fc-8194-ff7a5f5cef73" containerName="registry-server" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.302501 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0260a758-53a0-4da5-98e1-5520793f3afe" containerName="registry-server" Oct 02 13:10:31 crc 
kubenswrapper[4929]: I1002 13:10:31.302515 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bff17f1-9395-47c4-a706-a0adf7745c50" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.303249 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.308163 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.308457 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.308519 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.308779 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.321876 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6pmtz"] Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.349473 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-inventory\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.349701 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k657h\" (UniqueName: \"kubernetes.io/projected/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-kube-api-access-k657h\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.349841 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.349912 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ceph\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.350068 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.465080 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ceph\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.465228 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.465903 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-inventory\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.467759 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k657h\" (UniqueName: \"kubernetes.io/projected/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-kube-api-access-k657h\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.467867 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.473571 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ceph\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.473737 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.482518 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.483108 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-inventory\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.487023 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k657h\" (UniqueName: \"kubernetes.io/projected/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-kube-api-access-k657h\") pod \"bootstrap-openstack-openstack-cell1-6pmtz\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:31 crc kubenswrapper[4929]: I1002 13:10:31.633288 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:10:32 crc kubenswrapper[4929]: I1002 13:10:32.189933 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6pmtz"] Oct 02 13:10:32 crc kubenswrapper[4929]: I1002 13:10:32.199666 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:10:32 crc kubenswrapper[4929]: I1002 13:10:32.505448 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" event={"ID":"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8","Type":"ContainerStarted","Data":"6d39f17b562b3f1683268ed7a62eaf58a97c48c5c3ed8a4bd8ab02a148d03d4a"} Oct 02 13:10:33 crc kubenswrapper[4929]: I1002 13:10:33.516097 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" event={"ID":"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8","Type":"ContainerStarted","Data":"042a143c11ff09c13da0ed6b3a1d1925a0f65d98d81bc8c32e3c1efde8a558fa"} Oct 02 13:10:33 crc kubenswrapper[4929]: I1002 13:10:33.533281 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" podStartSLOduration=2.013859649 podStartE2EDuration="2.533261634s" podCreationTimestamp="2025-10-02 13:10:31 +0000 UTC" firstStartedPulling="2025-10-02 13:10:32.199457488 +0000 UTC m=+7232.749823852" lastFinishedPulling="2025-10-02 13:10:32.718859473 +0000 UTC m=+7233.269225837" observedRunningTime="2025-10-02 13:10:33.530275528 +0000 UTC m=+7234.080641892" watchObservedRunningTime="2025-10-02 13:10:33.533261634 +0000 UTC m=+7234.083627998" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.692097 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xwv84"] Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.694791 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.711530 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwv84"] Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.811124 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9mf4\" (UniqueName: \"kubernetes.io/projected/55d8d1fa-08ec-4421-bf44-7000f9523235-kube-api-access-s9mf4\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.811751 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-utilities\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.812155 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-catalog-content\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.914428 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9mf4\" (UniqueName: \"kubernetes.io/projected/55d8d1fa-08ec-4421-bf44-7000f9523235-kube-api-access-s9mf4\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.914521 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-utilities\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.914607 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-catalog-content\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.915063 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-utilities\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.915118 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-catalog-content\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:23 crc kubenswrapper[4929]: I1002 13:12:23.934086 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s9mf4\" (UniqueName: \"kubernetes.io/projected/55d8d1fa-08ec-4421-bf44-7000f9523235-kube-api-access-s9mf4\") pod \"redhat-operators-xwv84\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:24 crc kubenswrapper[4929]: I1002 13:12:24.026787 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:24 crc kubenswrapper[4929]: I1002 13:12:24.506141 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xwv84"] Oct 02 13:12:24 crc kubenswrapper[4929]: I1002 13:12:24.602616 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwv84" event={"ID":"55d8d1fa-08ec-4421-bf44-7000f9523235","Type":"ContainerStarted","Data":"74a2d791e39577555d3d155ceb1a38b8866e0e6f2780f1d313ef4ded072842c1"} Oct 02 13:12:25 crc kubenswrapper[4929]: I1002 13:12:25.612905 4929 generic.go:334] "Generic (PLEG): container finished" podID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerID="d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7" exitCode=0 Oct 02 13:12:25 crc kubenswrapper[4929]: I1002 13:12:25.612986 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwv84" event={"ID":"55d8d1fa-08ec-4421-bf44-7000f9523235","Type":"ContainerDied","Data":"d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7"} Oct 02 13:12:27 crc kubenswrapper[4929]: I1002 13:12:27.633670 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwv84" event={"ID":"55d8d1fa-08ec-4421-bf44-7000f9523235","Type":"ContainerStarted","Data":"793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4"} Oct 02 13:12:31 crc kubenswrapper[4929]: I1002 13:12:31.674941 4929 generic.go:334] "Generic (PLEG): container finished" podID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerID="793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4" exitCode=0 Oct 02 13:12:31 crc kubenswrapper[4929]: I1002 13:12:31.674999 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwv84" event={"ID":"55d8d1fa-08ec-4421-bf44-7000f9523235","Type":"ContainerDied","Data":"793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4"} Oct 02 13:12:32 crc kubenswrapper[4929]: I1002 13:12:32.688708 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwv84" event={"ID":"55d8d1fa-08ec-4421-bf44-7000f9523235","Type":"ContainerStarted","Data":"22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166"} Oct 02 13:12:32 crc kubenswrapper[4929]: I1002 13:12:32.715717 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xwv84" podStartSLOduration=3.225847632 podStartE2EDuration="9.715698968s" podCreationTimestamp="2025-10-02 13:12:23 +0000 UTC" firstStartedPulling="2025-10-02 13:12:25.616888234 +0000 UTC m=+7346.167254598" lastFinishedPulling="2025-10-02 13:12:32.10673957 +0000 UTC m=+7352.657105934" observedRunningTime="2025-10-02 13:12:32.707281535 +0000 UTC m=+7353.257647909" watchObservedRunningTime="2025-10-02 13:12:32.715698968 +0000 UTC m=+7353.266065332" Oct 02 13:12:34 crc kubenswrapper[4929]: I1002 13:12:34.027498 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 
13:12:34 crc kubenswrapper[4929]: I1002 13:12:34.027783 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:35 crc kubenswrapper[4929]: I1002 13:12:35.087225 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xwv84" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="registry-server" probeResult="failure" output=< Oct 02 13:12:35 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:12:35 crc kubenswrapper[4929]: > Oct 02 13:12:44 crc kubenswrapper[4929]: I1002 13:12:44.736896 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:12:44 crc kubenswrapper[4929]: I1002 13:12:44.737468 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:12:45 crc kubenswrapper[4929]: I1002 13:12:45.078713 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xwv84" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="registry-server" probeResult="failure" output=< Oct 02 13:12:45 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:12:45 crc kubenswrapper[4929]: > Oct 02 13:12:54 crc kubenswrapper[4929]: I1002 13:12:54.088493 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:54 crc kubenswrapper[4929]: I1002 13:12:54.139303 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:54 crc kubenswrapper[4929]: I1002 13:12:54.883176 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwv84"] Oct 02 13:12:55 crc kubenswrapper[4929]: I1002 13:12:55.921180 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xwv84" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="registry-server" containerID="cri-o://22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166" gracePeriod=2 Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.464808 4929 util.go:48] "No ready sandbox for pod can be found. 
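
The registry-server's startup probe is a health check against :50051 with a one-second deadline; until the catalog content has been extracted and the server is listening, the connection attempt times out, which is exactly the "failed to connect service \":50051\" within 1s" output above. A simplified TCP-level equivalent in Go (the real probe speaks the gRPC health protocol, so a bare dial is only an approximation):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// probeStartup reports whether anything is accepting TCP connections on
// addr within timeout. A gRPC health RPC would additionally verify the
// serving status; the bare dial is a simplified stand-in.
func probeStartup(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return fmt.Errorf("timeout: failed to connect service %q within %s: %w", addr, timeout, err)
	}
	return conn.Close()
}

func main() {
	if err := probeStartup("127.0.0.1:50051", time.Second); err != nil {
		fmt.Println("Probe failed:", err)
	} else {
		fmt.Println("started")
	}
}
```
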
Need to start a new one" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.577032 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9mf4\" (UniqueName: \"kubernetes.io/projected/55d8d1fa-08ec-4421-bf44-7000f9523235-kube-api-access-s9mf4\") pod \"55d8d1fa-08ec-4421-bf44-7000f9523235\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.577135 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-catalog-content\") pod \"55d8d1fa-08ec-4421-bf44-7000f9523235\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.577501 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-utilities\") pod \"55d8d1fa-08ec-4421-bf44-7000f9523235\" (UID: \"55d8d1fa-08ec-4421-bf44-7000f9523235\") " Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.578268 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-utilities" (OuterVolumeSpecName: "utilities") pod "55d8d1fa-08ec-4421-bf44-7000f9523235" (UID: "55d8d1fa-08ec-4421-bf44-7000f9523235"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.614934 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d8d1fa-08ec-4421-bf44-7000f9523235-kube-api-access-s9mf4" (OuterVolumeSpecName: "kube-api-access-s9mf4") pod "55d8d1fa-08ec-4421-bf44-7000f9523235" (UID: "55d8d1fa-08ec-4421-bf44-7000f9523235"). InnerVolumeSpecName "kube-api-access-s9mf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.680258 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9mf4\" (UniqueName: \"kubernetes.io/projected/55d8d1fa-08ec-4421-bf44-7000f9523235-kube-api-access-s9mf4\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.680295 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.686027 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55d8d1fa-08ec-4421-bf44-7000f9523235" (UID: "55d8d1fa-08ec-4421-bf44-7000f9523235"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.782166 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d8d1fa-08ec-4421-bf44-7000f9523235-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.930676 4929 generic.go:334] "Generic (PLEG): container finished" podID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerID="22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166" exitCode=0 Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.930732 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xwv84" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.930731 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwv84" event={"ID":"55d8d1fa-08ec-4421-bf44-7000f9523235","Type":"ContainerDied","Data":"22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166"} Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.930858 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xwv84" event={"ID":"55d8d1fa-08ec-4421-bf44-7000f9523235","Type":"ContainerDied","Data":"74a2d791e39577555d3d155ceb1a38b8866e0e6f2780f1d313ef4ded072842c1"} Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.930880 4929 scope.go:117] "RemoveContainer" containerID="22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.959760 4929 scope.go:117] "RemoveContainer" containerID="793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4" Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.967248 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xwv84"] Oct 02 13:12:56 crc kubenswrapper[4929]: I1002 13:12:56.976392 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xwv84"] Oct 02 13:12:57 crc kubenswrapper[4929]: I1002 13:12:57.001883 4929 scope.go:117] "RemoveContainer" containerID="d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7" Oct 02 13:12:57 crc kubenswrapper[4929]: I1002 13:12:57.036229 4929 scope.go:117] "RemoveContainer" containerID="22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166" Oct 02 13:12:57 crc kubenswrapper[4929]: E1002 13:12:57.037913 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166\": container with ID starting with 22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166 not found: ID does not exist" containerID="22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166" Oct 02 13:12:57 crc kubenswrapper[4929]: I1002 13:12:57.037975 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166"} err="failed to get container status \"22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166\": rpc error: code = NotFound desc = could not find container \"22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166\": container with ID starting with 22e91d4000eee8286cf06e03fac34778ea25304b11d0aed759eb496abdcdf166 not found: ID does not exist" Oct 02 13:12:57 crc 
kubenswrapper[4929]: I1002 13:12:57.038008 4929 scope.go:117] "RemoveContainer" containerID="793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4" Oct 02 13:12:57 crc kubenswrapper[4929]: E1002 13:12:57.039149 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4\": container with ID starting with 793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4 not found: ID does not exist" containerID="793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4" Oct 02 13:12:57 crc kubenswrapper[4929]: I1002 13:12:57.039178 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4"} err="failed to get container status \"793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4\": rpc error: code = NotFound desc = could not find container \"793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4\": container with ID starting with 793a7683b623bc154c8bc06abc396d131bd7a97530af8a44dd98b643ce296bf4 not found: ID does not exist" Oct 02 13:12:57 crc kubenswrapper[4929]: I1002 13:12:57.039197 4929 scope.go:117] "RemoveContainer" containerID="d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7" Oct 02 13:12:57 crc kubenswrapper[4929]: E1002 13:12:57.044000 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7\": container with ID starting with d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7 not found: ID does not exist" containerID="d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7" Oct 02 13:12:57 crc kubenswrapper[4929]: I1002 13:12:57.044033 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7"} err="failed to get container status \"d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7\": rpc error: code = NotFound desc = could not find container \"d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7\": container with ID starting with d6fc4c99991acc7b2d38e2eef2753b574891d1f04e131dcaf32d3c1a4c1839c7 not found: ID does not exist" Oct 02 13:12:58 crc kubenswrapper[4929]: I1002 13:12:58.188109 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" path="/var/lib/kubelet/pods/55d8d1fa-08ec-4421-bf44-7000f9523235/volumes" Oct 02 13:13:14 crc kubenswrapper[4929]: I1002 13:13:14.736636 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:13:14 crc kubenswrapper[4929]: I1002 13:13:14.737709 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:13:37 crc kubenswrapper[4929]: I1002 13:13:37.322010 4929 generic.go:334] "Generic (PLEG): 
container finished" podID="e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" containerID="042a143c11ff09c13da0ed6b3a1d1925a0f65d98d81bc8c32e3c1efde8a558fa" exitCode=0 Oct 02 13:13:37 crc kubenswrapper[4929]: I1002 13:13:37.322108 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" event={"ID":"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8","Type":"ContainerDied","Data":"042a143c11ff09c13da0ed6b3a1d1925a0f65d98d81bc8c32e3c1efde8a558fa"} Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.810896 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.923048 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ssh-key\") pod \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.923128 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-bootstrap-combined-ca-bundle\") pod \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.923155 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-inventory\") pod \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.923222 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k657h\" (UniqueName: \"kubernetes.io/projected/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-kube-api-access-k657h\") pod \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.923313 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ceph\") pod \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\" (UID: \"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8\") " Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.929465 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ceph" (OuterVolumeSpecName: "ceph") pod "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" (UID: "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.929853 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-kube-api-access-k657h" (OuterVolumeSpecName: "kube-api-access-k657h") pod "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" (UID: "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8"). InnerVolumeSpecName "kube-api-access-k657h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.930137 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" (UID: "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.957649 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-inventory" (OuterVolumeSpecName: "inventory") pod "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" (UID: "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:38 crc kubenswrapper[4929]: I1002 13:13:38.959169 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" (UID: "e3cdc9c0-5df2-4454-a0d3-4e041bca04e8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.026097 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.026135 4929 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.026152 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.026164 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k657h\" (UniqueName: \"kubernetes.io/projected/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-kube-api-access-k657h\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.026175 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e3cdc9c0-5df2-4454-a0d3-4e041bca04e8-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.354369 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" event={"ID":"e3cdc9c0-5df2-4454-a0d3-4e041bca04e8","Type":"ContainerDied","Data":"6d39f17b562b3f1683268ed7a62eaf58a97c48c5c3ed8a4bd8ab02a148d03d4a"} Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.354628 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d39f17b562b3f1683268ed7a62eaf58a97c48c5c3ed8a4bd8ab02a148d03d4a" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.354431 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6pmtz" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.449280 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-dvjbj"] Oct 02 13:13:39 crc kubenswrapper[4929]: E1002 13:13:39.449757 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" containerName="bootstrap-openstack-openstack-cell1" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.449776 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" containerName="bootstrap-openstack-openstack-cell1" Oct 02 13:13:39 crc kubenswrapper[4929]: E1002 13:13:39.449791 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="registry-server" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.449797 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="registry-server" Oct 02 13:13:39 crc kubenswrapper[4929]: E1002 13:13:39.449834 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="extract-utilities" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.449847 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="extract-utilities" Oct 02 13:13:39 crc kubenswrapper[4929]: E1002 13:13:39.449872 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="extract-content" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.449879 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="extract-content" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.450113 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d8d1fa-08ec-4421-bf44-7000f9523235" containerName="registry-server" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.450133 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3cdc9c0-5df2-4454-a0d3-4e041bca04e8" containerName="bootstrap-openstack-openstack-cell1" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.451240 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.453128 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.453725 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.454091 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.454986 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.471952 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-dvjbj"] Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.535611 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-inventory\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.535657 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ssh-key\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.535721 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrxch\" (UniqueName: \"kubernetes.io/projected/c748b552-6dd6-4df2-934c-651c8c00add9-kube-api-access-nrxch\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.535811 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ceph\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.637725 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-inventory\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.637770 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ssh-key\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc 
kubenswrapper[4929]: I1002 13:13:39.637830 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrxch\" (UniqueName: \"kubernetes.io/projected/c748b552-6dd6-4df2-934c-651c8c00add9-kube-api-access-nrxch\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.637888 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ceph\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.642409 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ssh-key\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.642997 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ceph\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.650632 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-inventory\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.653783 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrxch\" (UniqueName: \"kubernetes.io/projected/c748b552-6dd6-4df2-934c-651c8c00add9-kube-api-access-nrxch\") pod \"download-cache-openstack-openstack-cell1-dvjbj\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:39 crc kubenswrapper[4929]: I1002 13:13:39.767655 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.308766 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-dvjbj"] Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.365300 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" event={"ID":"c748b552-6dd6-4df2-934c-651c8c00add9","Type":"ContainerStarted","Data":"07e4d74f59a658a034b2dd514e4e0836569bd77cbc77be94fe277f671b36e21d"} Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.735540 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gs8l7"] Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.738628 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.752371 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gs8l7"] Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.867518 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-utilities\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.867678 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-catalog-content\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.868033 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkvw\" (UniqueName: \"kubernetes.io/projected/58030b7a-d369-4087-b812-1b203bda8e32-kube-api-access-blkvw\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.970439 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-catalog-content\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.970610 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkvw\" (UniqueName: \"kubernetes.io/projected/58030b7a-d369-4087-b812-1b203bda8e32-kube-api-access-blkvw\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.970895 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-utilities\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.971125 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-catalog-content\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.971168 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-utilities\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:40 crc kubenswrapper[4929]: I1002 13:13:40.989302 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-blkvw\" (UniqueName: \"kubernetes.io/projected/58030b7a-d369-4087-b812-1b203bda8e32-kube-api-access-blkvw\") pod \"community-operators-gs8l7\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:41 crc kubenswrapper[4929]: I1002 13:13:41.072304 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:41 crc kubenswrapper[4929]: I1002 13:13:41.377833 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" event={"ID":"c748b552-6dd6-4df2-934c-651c8c00add9","Type":"ContainerStarted","Data":"288671253e835a8e655256c4bb30d003ea17791ad141734c145b8c3b39cdf826"} Oct 02 13:13:41 crc kubenswrapper[4929]: I1002 13:13:41.403003 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" podStartSLOduration=1.918727597 podStartE2EDuration="2.40298261s" podCreationTimestamp="2025-10-02 13:13:39 +0000 UTC" firstStartedPulling="2025-10-02 13:13:40.323411289 +0000 UTC m=+7420.873777643" lastFinishedPulling="2025-10-02 13:13:40.807666292 +0000 UTC m=+7421.358032656" observedRunningTime="2025-10-02 13:13:41.397547822 +0000 UTC m=+7421.947914176" watchObservedRunningTime="2025-10-02 13:13:41.40298261 +0000 UTC m=+7421.953348994" Oct 02 13:13:41 crc kubenswrapper[4929]: I1002 13:13:41.642337 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gs8l7"] Oct 02 13:13:41 crc kubenswrapper[4929]: W1002 13:13:41.646650 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58030b7a_d369_4087_b812_1b203bda8e32.slice/crio-bd2c1b9fcfbb6b5354ee097dd15338467e6ca0c92ad2e63f62c2d29e08cee32d WatchSource:0}: Error finding container bd2c1b9fcfbb6b5354ee097dd15338467e6ca0c92ad2e63f62c2d29e08cee32d: Status 404 returned error can't find the container with id bd2c1b9fcfbb6b5354ee097dd15338467e6ca0c92ad2e63f62c2d29e08cee32d Oct 02 13:13:42 crc kubenswrapper[4929]: I1002 13:13:42.389667 4929 generic.go:334] "Generic (PLEG): container finished" podID="58030b7a-d369-4087-b812-1b203bda8e32" containerID="59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867" exitCode=0 Oct 02 13:13:42 crc kubenswrapper[4929]: I1002 13:13:42.389722 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gs8l7" event={"ID":"58030b7a-d369-4087-b812-1b203bda8e32","Type":"ContainerDied","Data":"59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867"} Oct 02 13:13:42 crc kubenswrapper[4929]: I1002 13:13:42.390018 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gs8l7" event={"ID":"58030b7a-d369-4087-b812-1b203bda8e32","Type":"ContainerStarted","Data":"bd2c1b9fcfbb6b5354ee097dd15338467e6ca0c92ad2e63f62c2d29e08cee32d"} Oct 02 13:13:43 crc kubenswrapper[4929]: I1002 13:13:43.401303 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gs8l7" event={"ID":"58030b7a-d369-4087-b812-1b203bda8e32","Type":"ContainerStarted","Data":"1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df"} Oct 02 13:13:44 crc kubenswrapper[4929]: I1002 13:13:44.411516 4929 generic.go:334] "Generic (PLEG): container finished" 
podID="58030b7a-d369-4087-b812-1b203bda8e32" containerID="1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df" exitCode=0 Oct 02 13:13:44 crc kubenswrapper[4929]: I1002 13:13:44.411567 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gs8l7" event={"ID":"58030b7a-d369-4087-b812-1b203bda8e32","Type":"ContainerDied","Data":"1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df"} Oct 02 13:13:44 crc kubenswrapper[4929]: I1002 13:13:44.737050 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:13:44 crc kubenswrapper[4929]: I1002 13:13:44.737390 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:13:44 crc kubenswrapper[4929]: I1002 13:13:44.737439 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:13:44 crc kubenswrapper[4929]: I1002 13:13:44.738544 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:13:44 crc kubenswrapper[4929]: I1002 13:13:44.738605 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" gracePeriod=600 Oct 02 13:13:44 crc kubenswrapper[4929]: E1002 13:13:44.860897 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:13:45 crc kubenswrapper[4929]: I1002 13:13:45.424921 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" exitCode=0 Oct 02 13:13:45 crc kubenswrapper[4929]: I1002 13:13:45.424974 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2"} Oct 02 13:13:45 crc kubenswrapper[4929]: I1002 13:13:45.425344 4929 scope.go:117] "RemoveContainer" containerID="99a1b9ceeefb5b10e03aadaa9dabdde98633cb99480c61cafc5d20d91d57af25" Oct 02 13:13:45 crc kubenswrapper[4929]: I1002 
13:13:45.426128 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:13:45 crc kubenswrapper[4929]: E1002 13:13:45.426451 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:13:45 crc kubenswrapper[4929]: I1002 13:13:45.429033 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gs8l7" event={"ID":"58030b7a-d369-4087-b812-1b203bda8e32","Type":"ContainerStarted","Data":"11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63"} Oct 02 13:13:45 crc kubenswrapper[4929]: I1002 13:13:45.478037 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gs8l7" podStartSLOduration=3.027823635 podStartE2EDuration="5.4780124s" podCreationTimestamp="2025-10-02 13:13:40 +0000 UTC" firstStartedPulling="2025-10-02 13:13:42.392301789 +0000 UTC m=+7422.942668163" lastFinishedPulling="2025-10-02 13:13:44.842490564 +0000 UTC m=+7425.392856928" observedRunningTime="2025-10-02 13:13:45.471401588 +0000 UTC m=+7426.021767952" watchObservedRunningTime="2025-10-02 13:13:45.4780124 +0000 UTC m=+7426.028378764" Oct 02 13:13:51 crc kubenswrapper[4929]: I1002 13:13:51.072532 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:51 crc kubenswrapper[4929]: I1002 13:13:51.073119 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:51 crc kubenswrapper[4929]: I1002 13:13:51.134723 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:51 crc kubenswrapper[4929]: I1002 13:13:51.538898 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:51 crc kubenswrapper[4929]: I1002 13:13:51.605662 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gs8l7"] Oct 02 13:13:53 crc kubenswrapper[4929]: I1002 13:13:53.498660 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gs8l7" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="registry-server" containerID="cri-o://11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63" gracePeriod=2 Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.022482 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.092498 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-catalog-content\") pod \"58030b7a-d369-4087-b812-1b203bda8e32\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.092562 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-utilities\") pod \"58030b7a-d369-4087-b812-1b203bda8e32\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.092723 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blkvw\" (UniqueName: \"kubernetes.io/projected/58030b7a-d369-4087-b812-1b203bda8e32-kube-api-access-blkvw\") pod \"58030b7a-d369-4087-b812-1b203bda8e32\" (UID: \"58030b7a-d369-4087-b812-1b203bda8e32\") " Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.093976 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-utilities" (OuterVolumeSpecName: "utilities") pod "58030b7a-d369-4087-b812-1b203bda8e32" (UID: "58030b7a-d369-4087-b812-1b203bda8e32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.133205 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58030b7a-d369-4087-b812-1b203bda8e32-kube-api-access-blkvw" (OuterVolumeSpecName: "kube-api-access-blkvw") pod "58030b7a-d369-4087-b812-1b203bda8e32" (UID: "58030b7a-d369-4087-b812-1b203bda8e32"). InnerVolumeSpecName "kube-api-access-blkvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.199457 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.199677 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blkvw\" (UniqueName: \"kubernetes.io/projected/58030b7a-d369-4087-b812-1b203bda8e32-kube-api-access-blkvw\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.513324 4929 generic.go:334] "Generic (PLEG): container finished" podID="58030b7a-d369-4087-b812-1b203bda8e32" containerID="11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63" exitCode=0 Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.513394 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gs8l7" event={"ID":"58030b7a-d369-4087-b812-1b203bda8e32","Type":"ContainerDied","Data":"11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63"} Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.513715 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gs8l7" event={"ID":"58030b7a-d369-4087-b812-1b203bda8e32","Type":"ContainerDied","Data":"bd2c1b9fcfbb6b5354ee097dd15338467e6ca0c92ad2e63f62c2d29e08cee32d"} Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.513745 4929 scope.go:117] "RemoveContainer" containerID="11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.513552 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gs8l7" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.540129 4929 scope.go:117] "RemoveContainer" containerID="1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.571361 4929 scope.go:117] "RemoveContainer" containerID="59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.623517 4929 scope.go:117] "RemoveContainer" containerID="11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63" Oct 02 13:13:54 crc kubenswrapper[4929]: E1002 13:13:54.624015 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63\": container with ID starting with 11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63 not found: ID does not exist" containerID="11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.624067 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63"} err="failed to get container status \"11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63\": rpc error: code = NotFound desc = could not find container \"11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63\": container with ID starting with 11c698ce8479247580c6a8e2c35ba273c9e201e12b4b7edc7cd8b50fb50f8a63 not found: ID does not exist" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.624115 4929 scope.go:117] "RemoveContainer" containerID="1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df" Oct 02 13:13:54 crc kubenswrapper[4929]: E1002 13:13:54.624614 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df\": container with ID starting with 1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df not found: ID does not exist" containerID="1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.624644 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df"} err="failed to get container status \"1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df\": rpc error: code = NotFound desc = could not find container \"1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df\": container with ID starting with 1b4cff44db670acf6847f6bd48ed2199c675ad1cbf4c928a0e6860565f42e3df not found: ID does not exist" Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.624668 4929 scope.go:117] "RemoveContainer" containerID="59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867" Oct 02 13:13:54 crc kubenswrapper[4929]: E1002 13:13:54.624972 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867\": container with ID starting with 59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867 not found: ID does not exist" containerID="59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867" 
Oct 02 13:13:54 crc kubenswrapper[4929]: I1002 13:13:54.625002 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867"} err="failed to get container status \"59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867\": rpc error: code = NotFound desc = could not find container \"59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867\": container with ID starting with 59cd94cb268e4fd6339985480d8cf06dd741580d090b5e97d4456ebbeb614867 not found: ID does not exist" Oct 02 13:13:55 crc kubenswrapper[4929]: I1002 13:13:55.142561 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58030b7a-d369-4087-b812-1b203bda8e32" (UID: "58030b7a-d369-4087-b812-1b203bda8e32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:13:55 crc kubenswrapper[4929]: I1002 13:13:55.220440 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58030b7a-d369-4087-b812-1b203bda8e32-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:55 crc kubenswrapper[4929]: I1002 13:13:55.454914 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gs8l7"] Oct 02 13:13:55 crc kubenswrapper[4929]: I1002 13:13:55.463603 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gs8l7"] Oct 02 13:13:56 crc kubenswrapper[4929]: I1002 13:13:56.171595 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58030b7a-d369-4087-b812-1b203bda8e32" path="/var/lib/kubelet/pods/58030b7a-d369-4087-b812-1b203bda8e32/volumes" Oct 02 13:13:57 crc kubenswrapper[4929]: I1002 13:13:57.157191 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:13:57 crc kubenswrapper[4929]: E1002 13:13:57.157813 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:14:12 crc kubenswrapper[4929]: I1002 13:14:12.157974 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:14:12 crc kubenswrapper[4929]: E1002 13:14:12.158642 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:14:24 crc kubenswrapper[4929]: I1002 13:14:24.158153 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:14:24 crc kubenswrapper[4929]: E1002 13:14:24.159028 4929 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:14:39 crc kubenswrapper[4929]: I1002 13:14:39.159430 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:14:39 crc kubenswrapper[4929]: E1002 13:14:39.160170 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:14:53 crc kubenswrapper[4929]: I1002 13:14:53.156850 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:14:53 crc kubenswrapper[4929]: E1002 13:14:53.157590 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.189115 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69"] Oct 02 13:15:00 crc kubenswrapper[4929]: E1002 13:15:00.190132 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="registry-server" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.190148 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="registry-server" Oct 02 13:15:00 crc kubenswrapper[4929]: E1002 13:15:00.190176 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="extract-utilities" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.190182 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="extract-utilities" Oct 02 13:15:00 crc kubenswrapper[4929]: E1002 13:15:00.190213 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="extract-content" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.190220 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="extract-content" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.190447 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="58030b7a-d369-4087-b812-1b203bda8e32" containerName="registry-server" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.191401 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69"] Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 
13:15:00.191503 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.194677 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.195250 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.297442 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-secret-volume\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.297574 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2524r\" (UniqueName: \"kubernetes.io/projected/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-kube-api-access-2524r\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.297676 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-config-volume\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.400144 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-secret-volume\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.400259 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2524r\" (UniqueName: \"kubernetes.io/projected/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-kube-api-access-2524r\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.400322 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-config-volume\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.401230 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-config-volume\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.406431 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-secret-volume\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.420446 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2524r\" (UniqueName: \"kubernetes.io/projected/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-kube-api-access-2524r\") pod \"collect-profiles-29323515-jgr69\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.518928 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:00 crc kubenswrapper[4929]: I1002 13:15:00.948492 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69"] Oct 02 13:15:01 crc kubenswrapper[4929]: I1002 13:15:01.129478 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" event={"ID":"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81","Type":"ContainerStarted","Data":"a6440221b8f755e930fc280af7cf90b548d1a88ef18a39b3ced3720e6d7e5139"} Oct 02 13:15:02 crc kubenswrapper[4929]: I1002 13:15:02.140512 4929 generic.go:334] "Generic (PLEG): container finished" podID="87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" containerID="1d931c3e944cafdffec2ef877bf193895a4390646d34156932bfd84099bf6fc6" exitCode=0 Oct 02 13:15:02 crc kubenswrapper[4929]: I1002 13:15:02.140603 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" event={"ID":"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81","Type":"ContainerDied","Data":"1d931c3e944cafdffec2ef877bf193895a4390646d34156932bfd84099bf6fc6"} Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.492865 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.678529 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-secret-volume\") pod \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.678681 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2524r\" (UniqueName: \"kubernetes.io/projected/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-kube-api-access-2524r\") pod \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.678720 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-config-volume\") pod \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\" (UID: \"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81\") " Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.679541 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" (UID: "87cb8ab5-a3d3-45d7-a1e1-4809e4986e81"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.685484 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" (UID: "87cb8ab5-a3d3-45d7-a1e1-4809e4986e81"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.685624 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-kube-api-access-2524r" (OuterVolumeSpecName: "kube-api-access-2524r") pod "87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" (UID: "87cb8ab5-a3d3-45d7-a1e1-4809e4986e81"). InnerVolumeSpecName "kube-api-access-2524r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.780532 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.780562 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2524r\" (UniqueName: \"kubernetes.io/projected/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-kube-api-access-2524r\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:03 crc kubenswrapper[4929]: I1002 13:15:03.780573 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:04 crc kubenswrapper[4929]: I1002 13:15:04.156367 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:15:04 crc kubenswrapper[4929]: E1002 13:15:04.157113 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:15:04 crc kubenswrapper[4929]: I1002 13:15:04.165277 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" Oct 02 13:15:04 crc kubenswrapper[4929]: I1002 13:15:04.168939 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69" event={"ID":"87cb8ab5-a3d3-45d7-a1e1-4809e4986e81","Type":"ContainerDied","Data":"a6440221b8f755e930fc280af7cf90b548d1a88ef18a39b3ced3720e6d7e5139"} Oct 02 13:15:04 crc kubenswrapper[4929]: I1002 13:15:04.169026 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6440221b8f755e930fc280af7cf90b548d1a88ef18a39b3ced3720e6d7e5139" Oct 02 13:15:04 crc kubenswrapper[4929]: I1002 13:15:04.559120 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d"] Oct 02 13:15:04 crc kubenswrapper[4929]: I1002 13:15:04.567527 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-qh64d"] Oct 02 13:15:06 crc kubenswrapper[4929]: I1002 13:15:06.171440 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfa29925-8e88-47ad-86c6-ef9db20ad61c" path="/var/lib/kubelet/pods/bfa29925-8e88-47ad-86c6-ef9db20ad61c/volumes" Oct 02 13:15:11 crc kubenswrapper[4929]: I1002 13:15:11.229243 4929 generic.go:334] "Generic (PLEG): container finished" podID="c748b552-6dd6-4df2-934c-651c8c00add9" containerID="288671253e835a8e655256c4bb30d003ea17791ad141734c145b8c3b39cdf826" exitCode=0 Oct 02 13:15:11 crc kubenswrapper[4929]: I1002 13:15:11.229735 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" event={"ID":"c748b552-6dd6-4df2-934c-651c8c00add9","Type":"ContainerDied","Data":"288671253e835a8e655256c4bb30d003ea17791ad141734c145b8c3b39cdf826"} Oct 02 
13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.699526 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.874349 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrxch\" (UniqueName: \"kubernetes.io/projected/c748b552-6dd6-4df2-934c-651c8c00add9-kube-api-access-nrxch\") pod \"c748b552-6dd6-4df2-934c-651c8c00add9\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.874531 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ssh-key\") pod \"c748b552-6dd6-4df2-934c-651c8c00add9\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.874568 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ceph\") pod \"c748b552-6dd6-4df2-934c-651c8c00add9\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.874606 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-inventory\") pod \"c748b552-6dd6-4df2-934c-651c8c00add9\" (UID: \"c748b552-6dd6-4df2-934c-651c8c00add9\") " Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.879446 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ceph" (OuterVolumeSpecName: "ceph") pod "c748b552-6dd6-4df2-934c-651c8c00add9" (UID: "c748b552-6dd6-4df2-934c-651c8c00add9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.882507 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c748b552-6dd6-4df2-934c-651c8c00add9-kube-api-access-nrxch" (OuterVolumeSpecName: "kube-api-access-nrxch") pod "c748b552-6dd6-4df2-934c-651c8c00add9" (UID: "c748b552-6dd6-4df2-934c-651c8c00add9"). InnerVolumeSpecName "kube-api-access-nrxch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.903122 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c748b552-6dd6-4df2-934c-651c8c00add9" (UID: "c748b552-6dd6-4df2-934c-651c8c00add9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.908596 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-inventory" (OuterVolumeSpecName: "inventory") pod "c748b552-6dd6-4df2-934c-651c8c00add9" (UID: "c748b552-6dd6-4df2-934c-651c8c00add9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.979382 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.979970 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.979986 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c748b552-6dd6-4df2-934c-651c8c00add9-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:12 crc kubenswrapper[4929]: I1002 13:15:12.980005 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrxch\" (UniqueName: \"kubernetes.io/projected/c748b552-6dd6-4df2-934c-651c8c00add9-kube-api-access-nrxch\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.252127 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" event={"ID":"c748b552-6dd6-4df2-934c-651c8c00add9","Type":"ContainerDied","Data":"07e4d74f59a658a034b2dd514e4e0836569bd77cbc77be94fe277f671b36e21d"} Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.252178 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07e4d74f59a658a034b2dd514e4e0836569bd77cbc77be94fe277f671b36e21d" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.252194 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-dvjbj" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.354874 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-cz59f"] Oct 02 13:15:13 crc kubenswrapper[4929]: E1002 13:15:13.355297 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c748b552-6dd6-4df2-934c-651c8c00add9" containerName="download-cache-openstack-openstack-cell1" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.355312 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c748b552-6dd6-4df2-934c-651c8c00add9" containerName="download-cache-openstack-openstack-cell1" Oct 02 13:15:13 crc kubenswrapper[4929]: E1002 13:15:13.355323 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" containerName="collect-profiles" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.355329 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" containerName="collect-profiles" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.355560 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c748b552-6dd6-4df2-934c-651c8c00add9" containerName="download-cache-openstack-openstack-cell1" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.355578 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" containerName="collect-profiles" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.356311 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.359107 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.359278 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.359286 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.359380 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.378517 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-cz59f"] Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.500479 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-inventory\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.500570 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ceph\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.500653 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djk28\" (UniqueName: \"kubernetes.io/projected/b58c4837-52b8-431e-b20a-1ab2fd041640-kube-api-access-djk28\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.500698 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ssh-key\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.602378 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djk28\" (UniqueName: \"kubernetes.io/projected/b58c4837-52b8-431e-b20a-1ab2fd041640-kube-api-access-djk28\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.602443 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ssh-key\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " 
pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.602600 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-inventory\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.602635 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ceph\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.606738 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ceph\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.607110 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-inventory\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.614693 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ssh-key\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.617903 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djk28\" (UniqueName: \"kubernetes.io/projected/b58c4837-52b8-431e-b20a-1ab2fd041640-kube-api-access-djk28\") pod \"configure-network-openstack-openstack-cell1-cz59f\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:13 crc kubenswrapper[4929]: I1002 13:15:13.674240 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:15:14 crc kubenswrapper[4929]: I1002 13:15:14.239354 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-cz59f"] Oct 02 13:15:14 crc kubenswrapper[4929]: I1002 13:15:14.264395 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" event={"ID":"b58c4837-52b8-431e-b20a-1ab2fd041640","Type":"ContainerStarted","Data":"b1d21306947533f47cca954ade4a782b433a6c301abeccf5c6ace336d5845be9"} Oct 02 13:15:15 crc kubenswrapper[4929]: I1002 13:15:15.293028 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" event={"ID":"b58c4837-52b8-431e-b20a-1ab2fd041640","Type":"ContainerStarted","Data":"03aed087f20e61d666ac9aa9153ac96480ae5d27e89685e720115b340494817a"} Oct 02 13:15:15 crc kubenswrapper[4929]: I1002 13:15:15.312807 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" podStartSLOduration=1.646601549 podStartE2EDuration="2.312789286s" podCreationTimestamp="2025-10-02 13:15:13 +0000 UTC" firstStartedPulling="2025-10-02 13:15:14.230886807 +0000 UTC m=+7514.781253171" lastFinishedPulling="2025-10-02 13:15:14.897074544 +0000 UTC m=+7515.447440908" observedRunningTime="2025-10-02 13:15:15.308294976 +0000 UTC m=+7515.858661370" watchObservedRunningTime="2025-10-02 13:15:15.312789286 +0000 UTC m=+7515.863155650" Oct 02 13:15:19 crc kubenswrapper[4929]: I1002 13:15:19.157332 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:15:19 crc kubenswrapper[4929]: E1002 13:15:19.157993 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:15:28 crc kubenswrapper[4929]: I1002 13:15:28.339992 4929 scope.go:117] "RemoveContainer" containerID="3dcf81ec885be28d70ab6d39d2ed9b6f3c5248dc831b2544084ff93bd32ed526" Oct 02 13:15:35 crc kubenswrapper[4929]: I1002 13:15:35.156865 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:15:35 crc kubenswrapper[4929]: E1002 13:15:35.157634 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:15:49 crc kubenswrapper[4929]: I1002 13:15:49.157701 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:15:49 crc kubenswrapper[4929]: E1002 13:15:49.159183 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.169578 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:16:00 crc kubenswrapper[4929]: E1002 13:16:00.171457 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.644880 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5p4sj"] Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.648009 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.661461 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5p4sj"] Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.708383 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-utilities\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.708874 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-catalog-content\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.709017 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrshs\" (UniqueName: \"kubernetes.io/projected/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-kube-api-access-hrshs\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.811815 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-utilities\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.812021 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-catalog-content\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.812050 4929 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrshs\" (UniqueName: \"kubernetes.io/projected/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-kube-api-access-hrshs\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.812590 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-utilities\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.812609 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-catalog-content\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.834090 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrshs\" (UniqueName: \"kubernetes.io/projected/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-kube-api-access-hrshs\") pod \"redhat-marketplace-5p4sj\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:00 crc kubenswrapper[4929]: I1002 13:16:00.972576 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.271034 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v2s5t"] Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.274219 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.283541 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v2s5t"] Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.424218 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5p4sj"] Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.424640 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twxps\" (UniqueName: \"kubernetes.io/projected/d1e45414-57d4-436c-940f-7c28b7d57928-kube-api-access-twxps\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.424835 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-catalog-content\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.424917 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-utilities\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.527592 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-catalog-content\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.527682 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-utilities\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.527775 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twxps\" (UniqueName: \"kubernetes.io/projected/d1e45414-57d4-436c-940f-7c28b7d57928-kube-api-access-twxps\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.528579 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-catalog-content\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.528824 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-utilities\") pod \"certified-operators-v2s5t\" (UID: 
\"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.553788 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twxps\" (UniqueName: \"kubernetes.io/projected/d1e45414-57d4-436c-940f-7c28b7d57928-kube-api-access-twxps\") pod \"certified-operators-v2s5t\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.600551 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.733032 4929 generic.go:334] "Generic (PLEG): container finished" podID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerID="44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477" exitCode=0 Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.733314 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5p4sj" event={"ID":"39c7db7d-3eeb-4106-950f-9c0c976dbe7d","Type":"ContainerDied","Data":"44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477"} Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.733347 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5p4sj" event={"ID":"39c7db7d-3eeb-4106-950f-9c0c976dbe7d","Type":"ContainerStarted","Data":"2a2ee9451b7e66119f60d73ab495b7cd45eed558456fdc69fb5439902204df92"} Oct 02 13:16:01 crc kubenswrapper[4929]: I1002 13:16:01.734882 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:16:02 crc kubenswrapper[4929]: I1002 13:16:02.128632 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v2s5t"] Oct 02 13:16:02 crc kubenswrapper[4929]: I1002 13:16:02.744476 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1e45414-57d4-436c-940f-7c28b7d57928" containerID="7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da" exitCode=0 Oct 02 13:16:02 crc kubenswrapper[4929]: I1002 13:16:02.744530 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v2s5t" event={"ID":"d1e45414-57d4-436c-940f-7c28b7d57928","Type":"ContainerDied","Data":"7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da"} Oct 02 13:16:02 crc kubenswrapper[4929]: I1002 13:16:02.744773 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v2s5t" event={"ID":"d1e45414-57d4-436c-940f-7c28b7d57928","Type":"ContainerStarted","Data":"e5cb15c223364125201be8bb5e4003f8b400f33f17e0dd559f704616b3aeb3cf"} Oct 02 13:16:03 crc kubenswrapper[4929]: I1002 13:16:03.756614 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v2s5t" event={"ID":"d1e45414-57d4-436c-940f-7c28b7d57928","Type":"ContainerStarted","Data":"6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880"} Oct 02 13:16:03 crc kubenswrapper[4929]: I1002 13:16:03.759344 4929 generic.go:334] "Generic (PLEG): container finished" podID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerID="daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58" exitCode=0 Oct 02 13:16:03 crc kubenswrapper[4929]: I1002 13:16:03.759516 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-5p4sj" event={"ID":"39c7db7d-3eeb-4106-950f-9c0c976dbe7d","Type":"ContainerDied","Data":"daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58"} Oct 02 13:16:04 crc kubenswrapper[4929]: I1002 13:16:04.778857 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5p4sj" event={"ID":"39c7db7d-3eeb-4106-950f-9c0c976dbe7d","Type":"ContainerStarted","Data":"6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e"} Oct 02 13:16:04 crc kubenswrapper[4929]: I1002 13:16:04.806587 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5p4sj" podStartSLOduration=2.286167312 podStartE2EDuration="4.806570585s" podCreationTimestamp="2025-10-02 13:16:00 +0000 UTC" firstStartedPulling="2025-10-02 13:16:01.734588874 +0000 UTC m=+7562.284955238" lastFinishedPulling="2025-10-02 13:16:04.254992147 +0000 UTC m=+7564.805358511" observedRunningTime="2025-10-02 13:16:04.802796436 +0000 UTC m=+7565.353162800" watchObservedRunningTime="2025-10-02 13:16:04.806570585 +0000 UTC m=+7565.356936949" Oct 02 13:16:05 crc kubenswrapper[4929]: I1002 13:16:05.794277 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1e45414-57d4-436c-940f-7c28b7d57928" containerID="6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880" exitCode=0 Oct 02 13:16:05 crc kubenswrapper[4929]: I1002 13:16:05.794576 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v2s5t" event={"ID":"d1e45414-57d4-436c-940f-7c28b7d57928","Type":"ContainerDied","Data":"6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880"} Oct 02 13:16:06 crc kubenswrapper[4929]: I1002 13:16:06.806720 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v2s5t" event={"ID":"d1e45414-57d4-436c-940f-7c28b7d57928","Type":"ContainerStarted","Data":"e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7"} Oct 02 13:16:06 crc kubenswrapper[4929]: I1002 13:16:06.833704 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v2s5t" podStartSLOduration=2.252038381 podStartE2EDuration="5.833682843s" podCreationTimestamp="2025-10-02 13:16:01 +0000 UTC" firstStartedPulling="2025-10-02 13:16:02.746121099 +0000 UTC m=+7563.296487463" lastFinishedPulling="2025-10-02 13:16:06.327765571 +0000 UTC m=+7566.878131925" observedRunningTime="2025-10-02 13:16:06.823164517 +0000 UTC m=+7567.373530891" watchObservedRunningTime="2025-10-02 13:16:06.833682843 +0000 UTC m=+7567.384049207" Oct 02 13:16:10 crc kubenswrapper[4929]: I1002 13:16:10.973512 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:10 crc kubenswrapper[4929]: I1002 13:16:10.974017 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:11 crc kubenswrapper[4929]: I1002 13:16:11.017892 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:11 crc kubenswrapper[4929]: I1002 13:16:11.601343 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:11 crc kubenswrapper[4929]: I1002 13:16:11.602828 4929 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:11 crc kubenswrapper[4929]: I1002 13:16:11.650112 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:11 crc kubenswrapper[4929]: I1002 13:16:11.907687 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:11 crc kubenswrapper[4929]: I1002 13:16:11.908429 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:12 crc kubenswrapper[4929]: I1002 13:16:12.849225 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5p4sj"] Oct 02 13:16:13 crc kubenswrapper[4929]: I1002 13:16:13.853506 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v2s5t"] Oct 02 13:16:13 crc kubenswrapper[4929]: I1002 13:16:13.879479 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5p4sj" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="registry-server" containerID="cri-o://6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e" gracePeriod=2 Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.159680 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:16:14 crc kubenswrapper[4929]: E1002 13:16:14.160372 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.317008 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.416860 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-utilities\") pod \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.417405 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-catalog-content\") pod \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.417560 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrshs\" (UniqueName: \"kubernetes.io/projected/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-kube-api-access-hrshs\") pod \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\" (UID: \"39c7db7d-3eeb-4106-950f-9c0c976dbe7d\") " Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.418739 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-utilities" (OuterVolumeSpecName: "utilities") pod "39c7db7d-3eeb-4106-950f-9c0c976dbe7d" (UID: "39c7db7d-3eeb-4106-950f-9c0c976dbe7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.423219 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-kube-api-access-hrshs" (OuterVolumeSpecName: "kube-api-access-hrshs") pod "39c7db7d-3eeb-4106-950f-9c0c976dbe7d" (UID: "39c7db7d-3eeb-4106-950f-9c0c976dbe7d"). InnerVolumeSpecName "kube-api-access-hrshs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.432280 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39c7db7d-3eeb-4106-950f-9c0c976dbe7d" (UID: "39c7db7d-3eeb-4106-950f-9c0c976dbe7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.519982 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrshs\" (UniqueName: \"kubernetes.io/projected/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-kube-api-access-hrshs\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.520012 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.520021 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39c7db7d-3eeb-4106-950f-9c0c976dbe7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.890181 4929 generic.go:334] "Generic (PLEG): container finished" podID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerID="6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e" exitCode=0 Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.890267 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5p4sj" event={"ID":"39c7db7d-3eeb-4106-950f-9c0c976dbe7d","Type":"ContainerDied","Data":"6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e"} Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.890333 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5p4sj" event={"ID":"39c7db7d-3eeb-4106-950f-9c0c976dbe7d","Type":"ContainerDied","Data":"2a2ee9451b7e66119f60d73ab495b7cd45eed558456fdc69fb5439902204df92"} Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.890354 4929 scope.go:117] "RemoveContainer" containerID="6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.890371 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5p4sj" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.890460 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v2s5t" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="registry-server" containerID="cri-o://e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7" gracePeriod=2 Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.914503 4929 scope.go:117] "RemoveContainer" containerID="daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58" Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.931324 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5p4sj"] Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.941491 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5p4sj"] Oct 02 13:16:14 crc kubenswrapper[4929]: I1002 13:16:14.989371 4929 scope.go:117] "RemoveContainer" containerID="44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.084013 4929 scope.go:117] "RemoveContainer" containerID="6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e" Oct 02 13:16:15 crc kubenswrapper[4929]: E1002 13:16:15.084926 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e\": container with ID starting with 6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e not found: ID does not exist" containerID="6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.085004 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e"} err="failed to get container status \"6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e\": rpc error: code = NotFound desc = could not find container \"6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e\": container with ID starting with 6a91688ec086a458ca23747707385d417f3a435086c9b2375b8f41f18c57020e not found: ID does not exist" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.085041 4929 scope.go:117] "RemoveContainer" containerID="daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58" Oct 02 13:16:15 crc kubenswrapper[4929]: E1002 13:16:15.085418 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58\": container with ID starting with daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58 not found: ID does not exist" containerID="daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.085473 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58"} err="failed to get container status \"daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58\": rpc error: code = NotFound desc = could not find container \"daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58\": container with ID starting with 
daa719f7b872c9c181fa526aef335947608692f8b0800fb36785d187a7434c58 not found: ID does not exist" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.085503 4929 scope.go:117] "RemoveContainer" containerID="44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477" Oct 02 13:16:15 crc kubenswrapper[4929]: E1002 13:16:15.085940 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477\": container with ID starting with 44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477 not found: ID does not exist" containerID="44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.085980 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477"} err="failed to get container status \"44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477\": rpc error: code = NotFound desc = could not find container \"44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477\": container with ID starting with 44e2e9d9576d168e09202d5b00401a9b5c885c07e405c04e7fa7ed193e454477 not found: ID does not exist" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.405545 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.540619 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-catalog-content\") pod \"d1e45414-57d4-436c-940f-7c28b7d57928\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.540798 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-utilities\") pod \"d1e45414-57d4-436c-940f-7c28b7d57928\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.540914 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twxps\" (UniqueName: \"kubernetes.io/projected/d1e45414-57d4-436c-940f-7c28b7d57928-kube-api-access-twxps\") pod \"d1e45414-57d4-436c-940f-7c28b7d57928\" (UID: \"d1e45414-57d4-436c-940f-7c28b7d57928\") " Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.542001 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-utilities" (OuterVolumeSpecName: "utilities") pod "d1e45414-57d4-436c-940f-7c28b7d57928" (UID: "d1e45414-57d4-436c-940f-7c28b7d57928"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.546786 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e45414-57d4-436c-940f-7c28b7d57928-kube-api-access-twxps" (OuterVolumeSpecName: "kube-api-access-twxps") pod "d1e45414-57d4-436c-940f-7c28b7d57928" (UID: "d1e45414-57d4-436c-940f-7c28b7d57928"). InnerVolumeSpecName "kube-api-access-twxps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.585548 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1e45414-57d4-436c-940f-7c28b7d57928" (UID: "d1e45414-57d4-436c-940f-7c28b7d57928"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.644155 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.644533 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1e45414-57d4-436c-940f-7c28b7d57928-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.644789 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twxps\" (UniqueName: \"kubernetes.io/projected/d1e45414-57d4-436c-940f-7c28b7d57928-kube-api-access-twxps\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.905288 4929 generic.go:334] "Generic (PLEG): container finished" podID="d1e45414-57d4-436c-940f-7c28b7d57928" containerID="e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7" exitCode=0 Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.905344 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v2s5t" event={"ID":"d1e45414-57d4-436c-940f-7c28b7d57928","Type":"ContainerDied","Data":"e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7"} Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.905403 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v2s5t" event={"ID":"d1e45414-57d4-436c-940f-7c28b7d57928","Type":"ContainerDied","Data":"e5cb15c223364125201be8bb5e4003f8b400f33f17e0dd559f704616b3aeb3cf"} Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.905428 4929 scope.go:117] "RemoveContainer" containerID="e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.905474 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v2s5t" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.951667 4929 scope.go:117] "RemoveContainer" containerID="6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.955835 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v2s5t"] Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.966518 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v2s5t"] Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.975078 4929 scope.go:117] "RemoveContainer" containerID="7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.995797 4929 scope.go:117] "RemoveContainer" containerID="e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7" Oct 02 13:16:15 crc kubenswrapper[4929]: E1002 13:16:15.996367 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7\": container with ID starting with e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7 not found: ID does not exist" containerID="e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.996426 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7"} err="failed to get container status \"e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7\": rpc error: code = NotFound desc = could not find container \"e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7\": container with ID starting with e7ac1ca42eb83d7164ff7905d01182d2d12eef434201fb1e807119a42a6c9ac7 not found: ID does not exist" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.996457 4929 scope.go:117] "RemoveContainer" containerID="6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880" Oct 02 13:16:15 crc kubenswrapper[4929]: E1002 13:16:15.996800 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880\": container with ID starting with 6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880 not found: ID does not exist" containerID="6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.996856 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880"} err="failed to get container status \"6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880\": rpc error: code = NotFound desc = could not find container \"6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880\": container with ID starting with 6f615454793e7ed96294670c03c108468bc0bf4421c5c2c2f2025fb5d6da8880 not found: ID does not exist" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.996888 4929 scope.go:117] "RemoveContainer" containerID="7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da" Oct 02 13:16:15 crc kubenswrapper[4929]: E1002 13:16:15.997228 4929 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da\": container with ID starting with 7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da not found: ID does not exist" containerID="7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da" Oct 02 13:16:15 crc kubenswrapper[4929]: I1002 13:16:15.997254 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da"} err="failed to get container status \"7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da\": rpc error: code = NotFound desc = could not find container \"7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da\": container with ID starting with 7625f2259e2f2fa5364e4be72bcf3af239c642aae1ae98972d294303667038da not found: ID does not exist" Oct 02 13:16:16 crc kubenswrapper[4929]: I1002 13:16:16.170107 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" path="/var/lib/kubelet/pods/39c7db7d-3eeb-4106-950f-9c0c976dbe7d/volumes" Oct 02 13:16:16 crc kubenswrapper[4929]: I1002 13:16:16.170835 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" path="/var/lib/kubelet/pods/d1e45414-57d4-436c-940f-7c28b7d57928/volumes" Oct 02 13:16:27 crc kubenswrapper[4929]: I1002 13:16:27.156691 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:16:27 crc kubenswrapper[4929]: E1002 13:16:27.158526 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:16:35 crc kubenswrapper[4929]: I1002 13:16:35.073949 4929 generic.go:334] "Generic (PLEG): container finished" podID="b58c4837-52b8-431e-b20a-1ab2fd041640" containerID="03aed087f20e61d666ac9aa9153ac96480ae5d27e89685e720115b340494817a" exitCode=0 Oct 02 13:16:35 crc kubenswrapper[4929]: I1002 13:16:35.073986 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" event={"ID":"b58c4837-52b8-431e-b20a-1ab2fd041640","Type":"ContainerDied","Data":"03aed087f20e61d666ac9aa9153ac96480ae5d27e89685e720115b340494817a"} Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.576828 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.708217 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djk28\" (UniqueName: \"kubernetes.io/projected/b58c4837-52b8-431e-b20a-1ab2fd041640-kube-api-access-djk28\") pod \"b58c4837-52b8-431e-b20a-1ab2fd041640\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.708302 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-inventory\") pod \"b58c4837-52b8-431e-b20a-1ab2fd041640\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.708462 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ssh-key\") pod \"b58c4837-52b8-431e-b20a-1ab2fd041640\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.708579 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ceph\") pod \"b58c4837-52b8-431e-b20a-1ab2fd041640\" (UID: \"b58c4837-52b8-431e-b20a-1ab2fd041640\") " Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.722714 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58c4837-52b8-431e-b20a-1ab2fd041640-kube-api-access-djk28" (OuterVolumeSpecName: "kube-api-access-djk28") pod "b58c4837-52b8-431e-b20a-1ab2fd041640" (UID: "b58c4837-52b8-431e-b20a-1ab2fd041640"). InnerVolumeSpecName "kube-api-access-djk28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.724288 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ceph" (OuterVolumeSpecName: "ceph") pod "b58c4837-52b8-431e-b20a-1ab2fd041640" (UID: "b58c4837-52b8-431e-b20a-1ab2fd041640"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.738536 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-inventory" (OuterVolumeSpecName: "inventory") pod "b58c4837-52b8-431e-b20a-1ab2fd041640" (UID: "b58c4837-52b8-431e-b20a-1ab2fd041640"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.743029 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b58c4837-52b8-431e-b20a-1ab2fd041640" (UID: "b58c4837-52b8-431e-b20a-1ab2fd041640"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.811240 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.811278 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.811288 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djk28\" (UniqueName: \"kubernetes.io/projected/b58c4837-52b8-431e-b20a-1ab2fd041640-kube-api-access-djk28\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:36 crc kubenswrapper[4929]: I1002 13:16:36.811300 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b58c4837-52b8-431e-b20a-1ab2fd041640-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.094024 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" event={"ID":"b58c4837-52b8-431e-b20a-1ab2fd041640","Type":"ContainerDied","Data":"b1d21306947533f47cca954ade4a782b433a6c301abeccf5c6ace336d5845be9"} Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.094073 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1d21306947533f47cca954ade4a782b433a6c301abeccf5c6ace336d5845be9" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.094087 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-cz59f" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.166740 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-ks8s7"] Oct 02 13:16:37 crc kubenswrapper[4929]: E1002 13:16:37.167276 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="registry-server" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167296 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="registry-server" Oct 02 13:16:37 crc kubenswrapper[4929]: E1002 13:16:37.167311 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="registry-server" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167319 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="registry-server" Oct 02 13:16:37 crc kubenswrapper[4929]: E1002 13:16:37.167333 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="extract-content" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167341 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="extract-content" Oct 02 13:16:37 crc kubenswrapper[4929]: E1002 13:16:37.167354 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="extract-utilities" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167359 4929 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="extract-utilities" Oct 02 13:16:37 crc kubenswrapper[4929]: E1002 13:16:37.167378 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58c4837-52b8-431e-b20a-1ab2fd041640" containerName="configure-network-openstack-openstack-cell1" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167385 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58c4837-52b8-431e-b20a-1ab2fd041640" containerName="configure-network-openstack-openstack-cell1" Oct 02 13:16:37 crc kubenswrapper[4929]: E1002 13:16:37.167403 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="extract-utilities" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167409 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="extract-utilities" Oct 02 13:16:37 crc kubenswrapper[4929]: E1002 13:16:37.167418 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="extract-content" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167423 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="extract-content" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167632 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c7db7d-3eeb-4106-950f-9c0c976dbe7d" containerName="registry-server" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167649 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e45414-57d4-436c-940f-7c28b7d57928" containerName="registry-server" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.167664 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58c4837-52b8-431e-b20a-1ab2fd041640" containerName="configure-network-openstack-openstack-cell1" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.168664 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.171043 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.171082 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.171337 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.172144 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.179118 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-ks8s7"] Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.319659 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.319718 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-inventory\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.319783 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ceph\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.320256 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmqf\" (UniqueName: \"kubernetes.io/projected/724f6796-c8d6-4bf3-9623-47da1ed4754f-kube-api-access-tsmqf\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.422469 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ceph\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.422650 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmqf\" (UniqueName: \"kubernetes.io/projected/724f6796-c8d6-4bf3-9623-47da1ed4754f-kube-api-access-tsmqf\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " 
pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.422727 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.422769 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-inventory\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.426754 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.426782 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-inventory\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.427435 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ceph\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.441632 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmqf\" (UniqueName: \"kubernetes.io/projected/724f6796-c8d6-4bf3-9623-47da1ed4754f-kube-api-access-tsmqf\") pod \"validate-network-openstack-openstack-cell1-ks8s7\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:37 crc kubenswrapper[4929]: I1002 13:16:37.486120 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:38 crc kubenswrapper[4929]: I1002 13:16:38.011054 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-ks8s7"] Oct 02 13:16:38 crc kubenswrapper[4929]: W1002 13:16:38.026393 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724f6796_c8d6_4bf3_9623_47da1ed4754f.slice/crio-cf11ee62eb411572ba3b27ea1be7925322441f60e582c4395bcd227f1d182ab8 WatchSource:0}: Error finding container cf11ee62eb411572ba3b27ea1be7925322441f60e582c4395bcd227f1d182ab8: Status 404 returned error can't find the container with id cf11ee62eb411572ba3b27ea1be7925322441f60e582c4395bcd227f1d182ab8 Oct 02 13:16:38 crc kubenswrapper[4929]: I1002 13:16:38.106108 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" event={"ID":"724f6796-c8d6-4bf3-9623-47da1ed4754f","Type":"ContainerStarted","Data":"cf11ee62eb411572ba3b27ea1be7925322441f60e582c4395bcd227f1d182ab8"} Oct 02 13:16:39 crc kubenswrapper[4929]: I1002 13:16:39.117581 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" event={"ID":"724f6796-c8d6-4bf3-9623-47da1ed4754f","Type":"ContainerStarted","Data":"67f7219c47fb4143fc2252dfc9856ca090f65f094fb41f68450c61eb97744880"} Oct 02 13:16:39 crc kubenswrapper[4929]: I1002 13:16:39.140615 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" podStartSLOduration=1.510382131 podStartE2EDuration="2.140597183s" podCreationTimestamp="2025-10-02 13:16:37 +0000 UTC" firstStartedPulling="2025-10-02 13:16:38.029811905 +0000 UTC m=+7598.580178269" lastFinishedPulling="2025-10-02 13:16:38.660026957 +0000 UTC m=+7599.210393321" observedRunningTime="2025-10-02 13:16:39.1311796 +0000 UTC m=+7599.681545964" watchObservedRunningTime="2025-10-02 13:16:39.140597183 +0000 UTC m=+7599.690963547" Oct 02 13:16:40 crc kubenswrapper[4929]: I1002 13:16:40.165915 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:16:40 crc kubenswrapper[4929]: E1002 13:16:40.166434 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:16:44 crc kubenswrapper[4929]: I1002 13:16:44.168668 4929 generic.go:334] "Generic (PLEG): container finished" podID="724f6796-c8d6-4bf3-9623-47da1ed4754f" containerID="67f7219c47fb4143fc2252dfc9856ca090f65f094fb41f68450c61eb97744880" exitCode=0 Oct 02 13:16:44 crc kubenswrapper[4929]: I1002 13:16:44.170812 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" event={"ID":"724f6796-c8d6-4bf3-9623-47da1ed4754f","Type":"ContainerDied","Data":"67f7219c47fb4143fc2252dfc9856ca090f65f094fb41f68450c61eb97744880"} Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.626398 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.709690 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ceph\") pod \"724f6796-c8d6-4bf3-9623-47da1ed4754f\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.709979 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-inventory\") pod \"724f6796-c8d6-4bf3-9623-47da1ed4754f\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.710177 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ssh-key\") pod \"724f6796-c8d6-4bf3-9623-47da1ed4754f\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.710520 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsmqf\" (UniqueName: \"kubernetes.io/projected/724f6796-c8d6-4bf3-9623-47da1ed4754f-kube-api-access-tsmqf\") pod \"724f6796-c8d6-4bf3-9623-47da1ed4754f\" (UID: \"724f6796-c8d6-4bf3-9623-47da1ed4754f\") " Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.715537 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ceph" (OuterVolumeSpecName: "ceph") pod "724f6796-c8d6-4bf3-9623-47da1ed4754f" (UID: "724f6796-c8d6-4bf3-9623-47da1ed4754f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.715775 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724f6796-c8d6-4bf3-9623-47da1ed4754f-kube-api-access-tsmqf" (OuterVolumeSpecName: "kube-api-access-tsmqf") pod "724f6796-c8d6-4bf3-9623-47da1ed4754f" (UID: "724f6796-c8d6-4bf3-9623-47da1ed4754f"). InnerVolumeSpecName "kube-api-access-tsmqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.738271 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-inventory" (OuterVolumeSpecName: "inventory") pod "724f6796-c8d6-4bf3-9623-47da1ed4754f" (UID: "724f6796-c8d6-4bf3-9623-47da1ed4754f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.739114 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "724f6796-c8d6-4bf3-9623-47da1ed4754f" (UID: "724f6796-c8d6-4bf3-9623-47da1ed4754f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.814021 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsmqf\" (UniqueName: \"kubernetes.io/projected/724f6796-c8d6-4bf3-9623-47da1ed4754f-kube-api-access-tsmqf\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.814054 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.814064 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:45 crc kubenswrapper[4929]: I1002 13:16:45.814073 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724f6796-c8d6-4bf3-9623-47da1ed4754f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.189174 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" event={"ID":"724f6796-c8d6-4bf3-9623-47da1ed4754f","Type":"ContainerDied","Data":"cf11ee62eb411572ba3b27ea1be7925322441f60e582c4395bcd227f1d182ab8"} Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.189468 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf11ee62eb411572ba3b27ea1be7925322441f60e582c4395bcd227f1d182ab8" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.189248 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ks8s7" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.264237 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-2wzfc"] Oct 02 13:16:46 crc kubenswrapper[4929]: E1002 13:16:46.264748 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724f6796-c8d6-4bf3-9623-47da1ed4754f" containerName="validate-network-openstack-openstack-cell1" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.264768 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="724f6796-c8d6-4bf3-9623-47da1ed4754f" containerName="validate-network-openstack-openstack-cell1" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.264998 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="724f6796-c8d6-4bf3-9623-47da1ed4754f" containerName="validate-network-openstack-openstack-cell1" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.265798 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.267680 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.268380 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.268756 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.270985 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.272406 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-2wzfc"] Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.322885 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7hgc\" (UniqueName: \"kubernetes.io/projected/c48e38fe-d536-4f2b-9d05-4831d3af9490-kube-api-access-k7hgc\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.322981 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ceph\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.323372 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-inventory\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.323444 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ssh-key\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.425421 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7hgc\" (UniqueName: \"kubernetes.io/projected/c48e38fe-d536-4f2b-9d05-4831d3af9490-kube-api-access-k7hgc\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.425530 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ceph\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.425617 
Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.425637 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ssh-key\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc"
Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.429553 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ssh-key\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc"
Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.434494 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-inventory\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc"
Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.437400 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ceph\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc"
Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.441764 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7hgc\" (UniqueName: \"kubernetes.io/projected/c48e38fe-d536-4f2b-9d05-4831d3af9490-kube-api-access-k7hgc\") pod \"install-os-openstack-openstack-cell1-2wzfc\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " pod="openstack/install-os-openstack-openstack-cell1-2wzfc"
Oct 02 13:16:46 crc kubenswrapper[4929]: I1002 13:16:46.588467 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2wzfc"
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:16:47 crc kubenswrapper[4929]: I1002 13:16:47.128081 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-2wzfc"] Oct 02 13:16:47 crc kubenswrapper[4929]: I1002 13:16:47.198546 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" event={"ID":"c48e38fe-d536-4f2b-9d05-4831d3af9490","Type":"ContainerStarted","Data":"7f7210abc371b05a16760e1a352bc107c1e4c5375299766a5c0b576be0ec1b69"} Oct 02 13:16:48 crc kubenswrapper[4929]: I1002 13:16:48.210240 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" event={"ID":"c48e38fe-d536-4f2b-9d05-4831d3af9490","Type":"ContainerStarted","Data":"392905e0980832cae562dbb2d625d832b6172fa08c32d5ce3089cd229aa473b6"} Oct 02 13:16:48 crc kubenswrapper[4929]: I1002 13:16:48.239553 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" podStartSLOduration=1.813371134 podStartE2EDuration="2.23952636s" podCreationTimestamp="2025-10-02 13:16:46 +0000 UTC" firstStartedPulling="2025-10-02 13:16:47.131466791 +0000 UTC m=+7607.681833155" lastFinishedPulling="2025-10-02 13:16:47.557622007 +0000 UTC m=+7608.107988381" observedRunningTime="2025-10-02 13:16:48.225916615 +0000 UTC m=+7608.776282999" watchObservedRunningTime="2025-10-02 13:16:48.23952636 +0000 UTC m=+7608.789892724" Oct 02 13:16:54 crc kubenswrapper[4929]: I1002 13:16:54.157422 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:16:54 crc kubenswrapper[4929]: E1002 13:16:54.158232 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:17:06 crc kubenswrapper[4929]: I1002 13:17:06.157087 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:17:06 crc kubenswrapper[4929]: E1002 13:17:06.157823 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:17:21 crc kubenswrapper[4929]: I1002 13:17:21.156403 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:17:21 crc kubenswrapper[4929]: E1002 13:17:21.157174 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:17:31 crc kubenswrapper[4929]: I1002 13:17:31.652220 4929 generic.go:334] "Generic (PLEG): container finished" podID="c48e38fe-d536-4f2b-9d05-4831d3af9490" containerID="392905e0980832cae562dbb2d625d832b6172fa08c32d5ce3089cd229aa473b6" exitCode=0 Oct 02 13:17:31 crc kubenswrapper[4929]: I1002 13:17:31.652306 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" event={"ID":"c48e38fe-d536-4f2b-9d05-4831d3af9490","Type":"ContainerDied","Data":"392905e0980832cae562dbb2d625d832b6172fa08c32d5ce3089cd229aa473b6"} Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.131062 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.195896 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ceph\") pod \"c48e38fe-d536-4f2b-9d05-4831d3af9490\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.196009 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ssh-key\") pod \"c48e38fe-d536-4f2b-9d05-4831d3af9490\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.196201 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7hgc\" (UniqueName: \"kubernetes.io/projected/c48e38fe-d536-4f2b-9d05-4831d3af9490-kube-api-access-k7hgc\") pod \"c48e38fe-d536-4f2b-9d05-4831d3af9490\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.196329 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-inventory\") pod \"c48e38fe-d536-4f2b-9d05-4831d3af9490\" (UID: \"c48e38fe-d536-4f2b-9d05-4831d3af9490\") " Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.201698 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ceph" (OuterVolumeSpecName: "ceph") pod "c48e38fe-d536-4f2b-9d05-4831d3af9490" (UID: "c48e38fe-d536-4f2b-9d05-4831d3af9490"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.201884 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48e38fe-d536-4f2b-9d05-4831d3af9490-kube-api-access-k7hgc" (OuterVolumeSpecName: "kube-api-access-k7hgc") pod "c48e38fe-d536-4f2b-9d05-4831d3af9490" (UID: "c48e38fe-d536-4f2b-9d05-4831d3af9490"). InnerVolumeSpecName "kube-api-access-k7hgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.230340 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c48e38fe-d536-4f2b-9d05-4831d3af9490" (UID: "c48e38fe-d536-4f2b-9d05-4831d3af9490"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.232792 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-inventory" (OuterVolumeSpecName: "inventory") pod "c48e38fe-d536-4f2b-9d05-4831d3af9490" (UID: "c48e38fe-d536-4f2b-9d05-4831d3af9490"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.298651 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.298681 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.298693 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c48e38fe-d536-4f2b-9d05-4831d3af9490-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.298703 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7hgc\" (UniqueName: \"kubernetes.io/projected/c48e38fe-d536-4f2b-9d05-4831d3af9490-kube-api-access-k7hgc\") on node \"crc\" DevicePath \"\"" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.676401 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" event={"ID":"c48e38fe-d536-4f2b-9d05-4831d3af9490","Type":"ContainerDied","Data":"7f7210abc371b05a16760e1a352bc107c1e4c5375299766a5c0b576be0ec1b69"} Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.676464 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2wzfc" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.676483 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f7210abc371b05a16760e1a352bc107c1e4c5375299766a5c0b576be0ec1b69" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.763657 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-dz8bz"] Oct 02 13:17:33 crc kubenswrapper[4929]: E1002 13:17:33.765083 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48e38fe-d536-4f2b-9d05-4831d3af9490" containerName="install-os-openstack-openstack-cell1" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.765116 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48e38fe-d536-4f2b-9d05-4831d3af9490" containerName="install-os-openstack-openstack-cell1" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.765511 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48e38fe-d536-4f2b-9d05-4831d3af9490" containerName="install-os-openstack-openstack-cell1" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.767025 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.770304 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.770555 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.770754 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.775484 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-dz8bz"] Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.812499 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.814649 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ceph\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.814726 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-inventory\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.814765 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92tx\" (UniqueName: \"kubernetes.io/projected/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-kube-api-access-l92tx\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.814809 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.917626 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ceph\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.917718 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-inventory\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.917766 
Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.917813 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz"
Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.922442 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ceph\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz"
Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.922555 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz"
Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.924938 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-inventory\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz"
Oct 02 13:17:33 crc kubenswrapper[4929]: I1002 13:17:33.939726 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92tx\" (UniqueName: \"kubernetes.io/projected/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-kube-api-access-l92tx\") pod \"configure-os-openstack-openstack-cell1-dz8bz\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " pod="openstack/configure-os-openstack-openstack-cell1-dz8bz"
Oct 02 13:17:34 crc kubenswrapper[4929]: I1002 13:17:34.154587 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz"
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:17:34 crc kubenswrapper[4929]: I1002 13:17:34.707945 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-dz8bz"] Oct 02 13:17:35 crc kubenswrapper[4929]: I1002 13:17:35.696940 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" event={"ID":"5c519b63-6b84-40a0-a2fa-d28e907d8c5d","Type":"ContainerStarted","Data":"e23766ea7ad1666a4e4b6ba4f7d25e9233c0bc1048696e6826c626cd397b7566"} Oct 02 13:17:35 crc kubenswrapper[4929]: I1002 13:17:35.699379 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" event={"ID":"5c519b63-6b84-40a0-a2fa-d28e907d8c5d","Type":"ContainerStarted","Data":"2ca4ba22210e79c467e9178fd5360ec4627070bf8369f0f586cf01a1e1dc4e16"} Oct 02 13:17:35 crc kubenswrapper[4929]: I1002 13:17:35.716379 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" podStartSLOduration=2.142692348 podStartE2EDuration="2.716361297s" podCreationTimestamp="2025-10-02 13:17:33 +0000 UTC" firstStartedPulling="2025-10-02 13:17:34.726499432 +0000 UTC m=+7655.276865796" lastFinishedPulling="2025-10-02 13:17:35.300168381 +0000 UTC m=+7655.850534745" observedRunningTime="2025-10-02 13:17:35.71404791 +0000 UTC m=+7656.264414274" watchObservedRunningTime="2025-10-02 13:17:35.716361297 +0000 UTC m=+7656.266727661" Oct 02 13:17:36 crc kubenswrapper[4929]: I1002 13:17:36.157401 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:17:36 crc kubenswrapper[4929]: E1002 13:17:36.158108 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:17:48 crc kubenswrapper[4929]: I1002 13:17:48.157807 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:17:48 crc kubenswrapper[4929]: E1002 13:17:48.158705 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:18:00 crc kubenswrapper[4929]: I1002 13:18:00.166731 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:18:00 crc kubenswrapper[4929]: E1002 13:18:00.172293 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:18:13 crc kubenswrapper[4929]: I1002 13:18:13.157205 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:18:13 crc kubenswrapper[4929]: E1002 13:18:13.157915 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:18:18 crc kubenswrapper[4929]: I1002 13:18:18.104214 4929 generic.go:334] "Generic (PLEG): container finished" podID="5c519b63-6b84-40a0-a2fa-d28e907d8c5d" containerID="e23766ea7ad1666a4e4b6ba4f7d25e9233c0bc1048696e6826c626cd397b7566" exitCode=0 Oct 02 13:18:18 crc kubenswrapper[4929]: I1002 13:18:18.104273 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" event={"ID":"5c519b63-6b84-40a0-a2fa-d28e907d8c5d","Type":"ContainerDied","Data":"e23766ea7ad1666a4e4b6ba4f7d25e9233c0bc1048696e6826c626cd397b7566"} Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.539525 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.621239 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92tx\" (UniqueName: \"kubernetes.io/projected/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-kube-api-access-l92tx\") pod \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.621520 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-inventory\") pod \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.621556 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ssh-key\") pod \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.621760 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ceph\") pod \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\" (UID: \"5c519b63-6b84-40a0-a2fa-d28e907d8c5d\") " Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.627810 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ceph" (OuterVolumeSpecName: "ceph") pod "5c519b63-6b84-40a0-a2fa-d28e907d8c5d" (UID: "5c519b63-6b84-40a0-a2fa-d28e907d8c5d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.629280 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-kube-api-access-l92tx" (OuterVolumeSpecName: "kube-api-access-l92tx") pod "5c519b63-6b84-40a0-a2fa-d28e907d8c5d" (UID: "5c519b63-6b84-40a0-a2fa-d28e907d8c5d"). InnerVolumeSpecName "kube-api-access-l92tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.651729 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c519b63-6b84-40a0-a2fa-d28e907d8c5d" (UID: "5c519b63-6b84-40a0-a2fa-d28e907d8c5d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.652088 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-inventory" (OuterVolumeSpecName: "inventory") pod "5c519b63-6b84-40a0-a2fa-d28e907d8c5d" (UID: "5c519b63-6b84-40a0-a2fa-d28e907d8c5d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.724086 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.724117 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92tx\" (UniqueName: \"kubernetes.io/projected/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-kube-api-access-l92tx\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.724128 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:19 crc kubenswrapper[4929]: I1002 13:18:19.724137 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c519b63-6b84-40a0-a2fa-d28e907d8c5d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.142328 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" event={"ID":"5c519b63-6b84-40a0-a2fa-d28e907d8c5d","Type":"ContainerDied","Data":"2ca4ba22210e79c467e9178fd5360ec4627070bf8369f0f586cf01a1e1dc4e16"} Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.142390 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca4ba22210e79c467e9178fd5360ec4627070bf8369f0f586cf01a1e1dc4e16" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.142424 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dz8bz" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.215144 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-vzp5g"] Oct 02 13:18:20 crc kubenswrapper[4929]: E1002 13:18:20.215618 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c519b63-6b84-40a0-a2fa-d28e907d8c5d" containerName="configure-os-openstack-openstack-cell1" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.215639 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c519b63-6b84-40a0-a2fa-d28e907d8c5d" containerName="configure-os-openstack-openstack-cell1" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.215857 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c519b63-6b84-40a0-a2fa-d28e907d8c5d" containerName="configure-os-openstack-openstack-cell1" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.216842 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.225527 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.225670 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.225707 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.225719 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.234621 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-vzp5g"] Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.337452 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.337767 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-inventory-0\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.337887 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ceph\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.337938 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dltcg\" (UniqueName: \"kubernetes.io/projected/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-kube-api-access-dltcg\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: 
\"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.439945 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ceph\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.440097 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dltcg\" (UniqueName: \"kubernetes.io/projected/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-kube-api-access-dltcg\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.440207 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.440287 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-inventory-0\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.444706 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.445114 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-inventory-0\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.445256 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ceph\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.464759 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dltcg\" (UniqueName: \"kubernetes.io/projected/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-kube-api-access-dltcg\") pod \"ssh-known-hosts-openstack-vzp5g\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:20 crc kubenswrapper[4929]: I1002 13:18:20.539834 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:21 crc kubenswrapper[4929]: I1002 13:18:21.063128 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-vzp5g"] Oct 02 13:18:21 crc kubenswrapper[4929]: I1002 13:18:21.154160 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vzp5g" event={"ID":"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31","Type":"ContainerStarted","Data":"3eccff0f93ba966fba3b268e1a403d8df8cdaf31cd729c0576e53ee87c2edc66"} Oct 02 13:18:22 crc kubenswrapper[4929]: I1002 13:18:22.167610 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vzp5g" event={"ID":"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31","Type":"ContainerStarted","Data":"4dc4272d368c39c12d2145e3c2886064c0fde4a5cbb17825652c650d10a6d1fe"} Oct 02 13:18:22 crc kubenswrapper[4929]: I1002 13:18:22.179452 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-vzp5g" podStartSLOduration=1.711590761 podStartE2EDuration="2.179431976s" podCreationTimestamp="2025-10-02 13:18:20 +0000 UTC" firstStartedPulling="2025-10-02 13:18:21.071397029 +0000 UTC m=+7701.621763393" lastFinishedPulling="2025-10-02 13:18:21.539238244 +0000 UTC m=+7702.089604608" observedRunningTime="2025-10-02 13:18:22.17682514 +0000 UTC m=+7702.727191504" watchObservedRunningTime="2025-10-02 13:18:22.179431976 +0000 UTC m=+7702.729798350" Oct 02 13:18:25 crc kubenswrapper[4929]: I1002 13:18:25.156534 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:18:25 crc kubenswrapper[4929]: E1002 13:18:25.157170 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:18:30 crc kubenswrapper[4929]: I1002 13:18:30.237316 4929 generic.go:334] "Generic (PLEG): container finished" podID="a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" containerID="4dc4272d368c39c12d2145e3c2886064c0fde4a5cbb17825652c650d10a6d1fe" exitCode=0 Oct 02 13:18:30 crc kubenswrapper[4929]: I1002 13:18:30.237431 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vzp5g" event={"ID":"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31","Type":"ContainerDied","Data":"4dc4272d368c39c12d2145e3c2886064c0fde4a5cbb17825652c650d10a6d1fe"} Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.703995 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.785258 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ssh-key-openstack-cell1\") pod \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.785362 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-inventory-0\") pod \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.785498 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ceph\") pod \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.785745 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dltcg\" (UniqueName: \"kubernetes.io/projected/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-kube-api-access-dltcg\") pod \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\" (UID: \"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31\") " Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.791545 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-kube-api-access-dltcg" (OuterVolumeSpecName: "kube-api-access-dltcg") pod "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" (UID: "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31"). InnerVolumeSpecName "kube-api-access-dltcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.796981 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ceph" (OuterVolumeSpecName: "ceph") pod "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" (UID: "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.819416 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" (UID: "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.842679 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" (UID: "a4d3a2ce-baba-4cc5-84ec-50b29f72fc31"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.889899 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dltcg\" (UniqueName: \"kubernetes.io/projected/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-kube-api-access-dltcg\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.890352 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.890408 4929 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:31 crc kubenswrapper[4929]: I1002 13:18:31.890459 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a4d3a2ce-baba-4cc5-84ec-50b29f72fc31-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.258559 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-vzp5g" event={"ID":"a4d3a2ce-baba-4cc5-84ec-50b29f72fc31","Type":"ContainerDied","Data":"3eccff0f93ba966fba3b268e1a403d8df8cdaf31cd729c0576e53ee87c2edc66"} Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.258821 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eccff0f93ba966fba3b268e1a403d8df8cdaf31cd729c0576e53ee87c2edc66" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.258610 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-vzp5g" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.330127 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-xmnfg"] Oct 02 13:18:32 crc kubenswrapper[4929]: E1002 13:18:32.331146 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" containerName="ssh-known-hosts-openstack" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.331171 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" containerName="ssh-known-hosts-openstack" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.331441 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4d3a2ce-baba-4cc5-84ec-50b29f72fc31" containerName="ssh-known-hosts-openstack" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.332363 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.339520 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.339761 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.340019 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.340178 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.351819 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-xmnfg"] Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.402047 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ssh-key\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.402091 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56g2w\" (UniqueName: \"kubernetes.io/projected/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-kube-api-access-56g2w\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.402756 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ceph\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.402887 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-inventory\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.505149 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ceph\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.505211 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-inventory\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.505297 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-56g2w\" (UniqueName: \"kubernetes.io/projected/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-kube-api-access-56g2w\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.505318 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ssh-key\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.510499 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-inventory\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.510753 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ssh-key\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.517755 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ceph\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.522271 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56g2w\" (UniqueName: \"kubernetes.io/projected/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-kube-api-access-56g2w\") pod \"run-os-openstack-openstack-cell1-xmnfg\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:32 crc kubenswrapper[4929]: I1002 13:18:32.656466 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:33 crc kubenswrapper[4929]: I1002 13:18:33.197948 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-xmnfg"] Oct 02 13:18:33 crc kubenswrapper[4929]: I1002 13:18:33.269655 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" event={"ID":"3491a7e6-e824-4a76-b1ee-58e43e17cbe5","Type":"ContainerStarted","Data":"2aa87221a98f2b39a65f44b336cfcd39f435303ea8a811037587cd277e683268"} Oct 02 13:18:34 crc kubenswrapper[4929]: I1002 13:18:34.292613 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" event={"ID":"3491a7e6-e824-4a76-b1ee-58e43e17cbe5","Type":"ContainerStarted","Data":"c589d23e79a0223e242444640f2d2ec487e07d03fc1a19e76d0c65bc883c63e4"} Oct 02 13:18:34 crc kubenswrapper[4929]: I1002 13:18:34.314151 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" podStartSLOduration=1.915275495 podStartE2EDuration="2.314124138s" podCreationTimestamp="2025-10-02 13:18:32 +0000 UTC" firstStartedPulling="2025-10-02 13:18:33.191759964 +0000 UTC m=+7713.742126328" lastFinishedPulling="2025-10-02 13:18:33.590608607 +0000 UTC m=+7714.140974971" observedRunningTime="2025-10-02 13:18:34.307170866 +0000 UTC m=+7714.857537230" watchObservedRunningTime="2025-10-02 13:18:34.314124138 +0000 UTC m=+7714.864490502" Oct 02 13:18:39 crc kubenswrapper[4929]: I1002 13:18:39.157809 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:18:39 crc kubenswrapper[4929]: E1002 13:18:39.158618 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:18:42 crc kubenswrapper[4929]: I1002 13:18:42.368448 4929 generic.go:334] "Generic (PLEG): container finished" podID="3491a7e6-e824-4a76-b1ee-58e43e17cbe5" containerID="c589d23e79a0223e242444640f2d2ec487e07d03fc1a19e76d0c65bc883c63e4" exitCode=0 Oct 02 13:18:42 crc kubenswrapper[4929]: I1002 13:18:42.368493 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" event={"ID":"3491a7e6-e824-4a76-b1ee-58e43e17cbe5","Type":"ContainerDied","Data":"c589d23e79a0223e242444640f2d2ec487e07d03fc1a19e76d0c65bc883c63e4"} Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.875910 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.951406 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ceph\") pod \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.951531 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-inventory\") pod \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.951578 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56g2w\" (UniqueName: \"kubernetes.io/projected/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-kube-api-access-56g2w\") pod \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.951892 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ssh-key\") pod \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\" (UID: \"3491a7e6-e824-4a76-b1ee-58e43e17cbe5\") " Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.958267 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ceph" (OuterVolumeSpecName: "ceph") pod "3491a7e6-e824-4a76-b1ee-58e43e17cbe5" (UID: "3491a7e6-e824-4a76-b1ee-58e43e17cbe5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.966504 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-kube-api-access-56g2w" (OuterVolumeSpecName: "kube-api-access-56g2w") pod "3491a7e6-e824-4a76-b1ee-58e43e17cbe5" (UID: "3491a7e6-e824-4a76-b1ee-58e43e17cbe5"). InnerVolumeSpecName "kube-api-access-56g2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.985670 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3491a7e6-e824-4a76-b1ee-58e43e17cbe5" (UID: "3491a7e6-e824-4a76-b1ee-58e43e17cbe5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:43 crc kubenswrapper[4929]: I1002 13:18:43.993053 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-inventory" (OuterVolumeSpecName: "inventory") pod "3491a7e6-e824-4a76-b1ee-58e43e17cbe5" (UID: "3491a7e6-e824-4a76-b1ee-58e43e17cbe5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.055128 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.055182 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.055199 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56g2w\" (UniqueName: \"kubernetes.io/projected/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-kube-api-access-56g2w\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.055211 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3491a7e6-e824-4a76-b1ee-58e43e17cbe5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.395232 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" event={"ID":"3491a7e6-e824-4a76-b1ee-58e43e17cbe5","Type":"ContainerDied","Data":"2aa87221a98f2b39a65f44b336cfcd39f435303ea8a811037587cd277e683268"} Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.395284 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa87221a98f2b39a65f44b336cfcd39f435303ea8a811037587cd277e683268" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.395498 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-xmnfg" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.466948 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-fgfkl"] Oct 02 13:18:44 crc kubenswrapper[4929]: E1002 13:18:44.467760 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3491a7e6-e824-4a76-b1ee-58e43e17cbe5" containerName="run-os-openstack-openstack-cell1" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.467779 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3491a7e6-e824-4a76-b1ee-58e43e17cbe5" containerName="run-os-openstack-openstack-cell1" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.468079 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3491a7e6-e824-4a76-b1ee-58e43e17cbe5" containerName="run-os-openstack-openstack-cell1" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.469146 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.472837 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.472852 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.473562 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.475345 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.494370 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-fgfkl"] Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.566831 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.566906 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ctkq\" (UniqueName: \"kubernetes.io/projected/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-kube-api-access-5ctkq\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.566944 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ceph\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.567031 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-inventory\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.668927 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-inventory\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.669224 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.669250 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5ctkq\" (UniqueName: \"kubernetes.io/projected/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-kube-api-access-5ctkq\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.669275 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ceph\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.673797 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ceph\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.673919 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-inventory\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.676748 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.689822 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ctkq\" (UniqueName: \"kubernetes.io/projected/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-kube-api-access-5ctkq\") pod \"reboot-os-openstack-openstack-cell1-fgfkl\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:44 crc kubenswrapper[4929]: I1002 13:18:44.808128 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:18:45 crc kubenswrapper[4929]: I1002 13:18:45.420014 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-fgfkl"] Oct 02 13:18:46 crc kubenswrapper[4929]: I1002 13:18:46.418455 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" event={"ID":"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3","Type":"ContainerStarted","Data":"9fd6da896623467db8114d56fe6e4b187dbcc39cb494f5f3cb31ad866cb379dd"} Oct 02 13:18:47 crc kubenswrapper[4929]: I1002 13:18:47.432122 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" event={"ID":"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3","Type":"ContainerStarted","Data":"884737e31db576934da82a93b02ccc8701c1f40e1f92969fd237149ed377e1a4"} Oct 02 13:18:47 crc kubenswrapper[4929]: I1002 13:18:47.457258 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" podStartSLOduration=2.7655730309999997 podStartE2EDuration="3.457232827s" podCreationTimestamp="2025-10-02 13:18:44 +0000 UTC" firstStartedPulling="2025-10-02 13:18:45.425913487 +0000 UTC m=+7725.976279851" lastFinishedPulling="2025-10-02 13:18:46.117573283 +0000 UTC m=+7726.667939647" observedRunningTime="2025-10-02 13:18:47.450039998 +0000 UTC m=+7728.000406372" watchObservedRunningTime="2025-10-02 13:18:47.457232827 +0000 UTC m=+7728.007599191" Oct 02 13:18:53 crc kubenswrapper[4929]: I1002 13:18:53.157431 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:18:53 crc kubenswrapper[4929]: I1002 13:18:53.489210 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"21a8a6b45d1a4cae8908735413cb71002fff687b44ca377226351587dfe46ed8"} Oct 02 13:19:02 crc kubenswrapper[4929]: I1002 13:19:02.577585 4929 generic.go:334] "Generic (PLEG): container finished" podID="3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" containerID="884737e31db576934da82a93b02ccc8701c1f40e1f92969fd237149ed377e1a4" exitCode=0 Oct 02 13:19:02 crc kubenswrapper[4929]: I1002 13:19:02.577663 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" event={"ID":"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3","Type":"ContainerDied","Data":"884737e31db576934da82a93b02ccc8701c1f40e1f92969fd237149ed377e1a4"} Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.051163 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.118282 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ssh-key\") pod \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.118733 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-inventory\") pod \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.118776 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ctkq\" (UniqueName: \"kubernetes.io/projected/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-kube-api-access-5ctkq\") pod \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.118890 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ceph\") pod \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\" (UID: \"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3\") " Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.125105 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-kube-api-access-5ctkq" (OuterVolumeSpecName: "kube-api-access-5ctkq") pod "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" (UID: "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3"). InnerVolumeSpecName "kube-api-access-5ctkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.125592 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ceph" (OuterVolumeSpecName: "ceph") pod "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" (UID: "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.154521 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-inventory" (OuterVolumeSpecName: "inventory") pod "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" (UID: "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.159648 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" (UID: "3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.221265 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.221306 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.221321 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ctkq\" (UniqueName: \"kubernetes.io/projected/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-kube-api-access-5ctkq\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.221333 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.598925 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" event={"ID":"3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3","Type":"ContainerDied","Data":"9fd6da896623467db8114d56fe6e4b187dbcc39cb494f5f3cb31ad866cb379dd"} Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.598985 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fd6da896623467db8114d56fe6e4b187dbcc39cb494f5f3cb31ad866cb379dd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.599066 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-fgfkl" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.679730 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-bwpjd"] Oct 02 13:19:04 crc kubenswrapper[4929]: E1002 13:19:04.680281 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" containerName="reboot-os-openstack-openstack-cell1" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.680301 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" containerName="reboot-os-openstack-openstack-cell1" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.681342 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3" containerName="reboot-os-openstack-openstack-cell1" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.682582 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.684774 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.685057 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.685239 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.685771 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.691741 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-bwpjd"] Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.836062 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.836140 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz9rx\" (UniqueName: \"kubernetes.io/projected/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-kube-api-access-jz9rx\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.836202 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.836432 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-inventory\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.836650 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.836871 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-telemetry-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.837010 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ssh-key\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.837310 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ceph\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.837602 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.837729 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.838026 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.838374 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940310 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940366 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940430 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940496 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940539 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940557 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz9rx\" (UniqueName: \"kubernetes.io/projected/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-kube-api-access-jz9rx\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940583 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940602 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-inventory\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940624 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940643 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940663 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ssh-key\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.940698 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ceph\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.945325 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.952448 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-inventory\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.953040 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.954825 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ssh-key\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.955853 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.955876 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: 
\"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.956098 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.956353 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ceph\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.957569 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.958184 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.959200 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:04 crc kubenswrapper[4929]: I1002 13:19:04.962903 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz9rx\" (UniqueName: \"kubernetes.io/projected/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-kube-api-access-jz9rx\") pod \"install-certs-openstack-openstack-cell1-bwpjd\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") " pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:05 crc kubenswrapper[4929]: I1002 13:19:05.005673 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:05 crc kubenswrapper[4929]: I1002 13:19:05.552220 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-bwpjd"] Oct 02 13:19:05 crc kubenswrapper[4929]: W1002 13:19:05.554114 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0fbe02_afa8_426c_b7dd_bc9dde60e8a2.slice/crio-26c423726c0f807497e4315aec0aed9ce7fff3618b7874395addac45ccee93b6 WatchSource:0}: Error finding container 26c423726c0f807497e4315aec0aed9ce7fff3618b7874395addac45ccee93b6: Status 404 returned error can't find the container with id 26c423726c0f807497e4315aec0aed9ce7fff3618b7874395addac45ccee93b6 Oct 02 13:19:05 crc kubenswrapper[4929]: I1002 13:19:05.609322 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" event={"ID":"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2","Type":"ContainerStarted","Data":"26c423726c0f807497e4315aec0aed9ce7fff3618b7874395addac45ccee93b6"} Oct 02 13:19:06 crc kubenswrapper[4929]: I1002 13:19:06.621612 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" event={"ID":"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2","Type":"ContainerStarted","Data":"1ac525cdaebcaca1f85731883977f1cea7438ef75aed2e301cd963d722bc1f13"} Oct 02 13:19:06 crc kubenswrapper[4929]: I1002 13:19:06.651834 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" podStartSLOduration=2.228630314 podStartE2EDuration="2.651812793s" podCreationTimestamp="2025-10-02 13:19:04 +0000 UTC" firstStartedPulling="2025-10-02 13:19:05.556477254 +0000 UTC m=+7746.106843628" lastFinishedPulling="2025-10-02 13:19:05.979659743 +0000 UTC m=+7746.530026107" observedRunningTime="2025-10-02 13:19:06.64312842 +0000 UTC m=+7747.193494794" watchObservedRunningTime="2025-10-02 13:19:06.651812793 +0000 UTC m=+7747.202179157" Oct 02 13:19:25 crc kubenswrapper[4929]: I1002 13:19:25.823906 4929 generic.go:334] "Generic (PLEG): container finished" podID="3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" containerID="1ac525cdaebcaca1f85731883977f1cea7438ef75aed2e301cd963d722bc1f13" exitCode=0 Oct 02 13:19:25 crc kubenswrapper[4929]: I1002 13:19:25.824092 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" event={"ID":"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2","Type":"ContainerDied","Data":"1ac525cdaebcaca1f85731883977f1cea7438ef75aed2e301cd963d722bc1f13"} Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.294802 4929 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.294802 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd"
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452472 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz9rx\" (UniqueName: \"kubernetes.io/projected/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-kube-api-access-jz9rx\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452536 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-dhcp-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452626 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-telemetry-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452655 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-metadata-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452700 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ceph\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452729 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-nova-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452786 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-bootstrap-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452850 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-libvirt-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452912 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ovn-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452949 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ssh-key\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.452994 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-inventory\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.453057 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-sriov-combined-ca-bundle\") pod \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\" (UID: \"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2\") "
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.458481 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ceph" (OuterVolumeSpecName: "ceph") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.458510 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.459329 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.460120 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.460188 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.460202 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-kube-api-access-jz9rx" (OuterVolumeSpecName: "kube-api-access-jz9rx") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "kube-api-access-jz9rx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.460218 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.460272 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.460518 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.460565 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.485166 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-inventory" (OuterVolumeSpecName: "inventory") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.490538 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" (UID: "3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555734 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555779 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555788 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-inventory\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555799 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555808 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz9rx\" (UniqueName: \"kubernetes.io/projected/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-kube-api-access-jz9rx\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555818 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555829 4929 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555837 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555845 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-ceph\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555855 4929 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555865 4929 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.555874 4929 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" event={"ID":"3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2","Type":"ContainerDied","Data":"26c423726c0f807497e4315aec0aed9ce7fff3618b7874395addac45ccee93b6"} Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.846364 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c423726c0f807497e4315aec0aed9ce7fff3618b7874395addac45ccee93b6" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.846386 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bwpjd" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.932815 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-z8dbl"] Oct 02 13:19:27 crc kubenswrapper[4929]: E1002 13:19:27.933422 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" containerName="install-certs-openstack-openstack-cell1" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.933442 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" containerName="install-certs-openstack-openstack-cell1" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.933643 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2" containerName="install-certs-openstack-openstack-cell1" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.934629 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.937019 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.937501 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.937629 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.938735 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:19:27 crc kubenswrapper[4929]: I1002 13:19:27.953902 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-z8dbl"] Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.067099 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.067196 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ceph\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.067349 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-vczfc\" (UniqueName: \"kubernetes.io/projected/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-kube-api-access-vczfc\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.067427 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-inventory\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.169401 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.169485 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ceph\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.169588 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vczfc\" (UniqueName: \"kubernetes.io/projected/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-kube-api-access-vczfc\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.169664 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-inventory\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.173647 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-inventory\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.173647 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.175946 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ceph\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 
crc kubenswrapper[4929]: I1002 13:19:28.185651 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vczfc\" (UniqueName: \"kubernetes.io/projected/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-kube-api-access-vczfc\") pod \"ceph-client-openstack-openstack-cell1-z8dbl\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.256830 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.809631 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-z8dbl"] Oct 02 13:19:28 crc kubenswrapper[4929]: I1002 13:19:28.858042 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" event={"ID":"6f793d5a-649b-4ef6-9935-deabf8dcd0c8","Type":"ContainerStarted","Data":"9e632912144c592f0995440c2881cd0d2d716653b18cb3fc1ac1afd9237791f1"} Oct 02 13:19:29 crc kubenswrapper[4929]: I1002 13:19:29.869909 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" event={"ID":"6f793d5a-649b-4ef6-9935-deabf8dcd0c8","Type":"ContainerStarted","Data":"fde8bbb91a9cc7c8881c759055be19eb79420d293ef6578525aa19bdfa213dcc"} Oct 02 13:19:29 crc kubenswrapper[4929]: I1002 13:19:29.896790 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" podStartSLOduration=2.376671417 podStartE2EDuration="2.896771181s" podCreationTimestamp="2025-10-02 13:19:27 +0000 UTC" firstStartedPulling="2025-10-02 13:19:28.822383331 +0000 UTC m=+7769.372749695" lastFinishedPulling="2025-10-02 13:19:29.342483095 +0000 UTC m=+7769.892849459" observedRunningTime="2025-10-02 13:19:29.89122903 +0000 UTC m=+7770.441595414" watchObservedRunningTime="2025-10-02 13:19:29.896771181 +0000 UTC m=+7770.447137545" Oct 02 13:19:34 crc kubenswrapper[4929]: I1002 13:19:34.922034 4929 generic.go:334] "Generic (PLEG): container finished" podID="6f793d5a-649b-4ef6-9935-deabf8dcd0c8" containerID="fde8bbb91a9cc7c8881c759055be19eb79420d293ef6578525aa19bdfa213dcc" exitCode=0 Oct 02 13:19:34 crc kubenswrapper[4929]: I1002 13:19:34.922482 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" event={"ID":"6f793d5a-649b-4ef6-9935-deabf8dcd0c8","Type":"ContainerDied","Data":"fde8bbb91a9cc7c8881c759055be19eb79420d293ef6578525aa19bdfa213dcc"} Oct 02 13:19:36 crc kubenswrapper[4929]: I1002 13:19:36.941952 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" event={"ID":"6f793d5a-649b-4ef6-9935-deabf8dcd0c8","Type":"ContainerDied","Data":"9e632912144c592f0995440c2881cd0d2d716653b18cb3fc1ac1afd9237791f1"} Oct 02 13:19:36 crc kubenswrapper[4929]: I1002 13:19:36.942536 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e632912144c592f0995440c2881cd0d2d716653b18cb3fc1ac1afd9237791f1" Oct 02 13:19:36 crc kubenswrapper[4929]: I1002 13:19:36.961175 4929 util.go:48] "No ready sandbox for pod can be found. 
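[Editor's note] The "Generic (PLEG): container finished" lines above come from the pod lifecycle event generator: it periodically relists containers from the runtime, diffs the new states against the old, and turns each transition into the ContainerStarted/ContainerDied events the sync loop consumes. A reduced sketch of that diff, with made-up types:

package main

import "fmt"

type state int

const (
	running state = iota
	exited
)

type event struct{ id, kind string }

// relist diffs the previous and current container states and emits
// lifecycle events, the way PLEG turns runtime state into the
// ContainerStarted/ContainerDied lines in this log.
func relist(old, cur map[string]state) []event {
	var evs []event
	for id, s := range cur {
		prev, seen := old[id]
		switch {
		case !seen && s == running:
			evs = append(evs, event{id, "ContainerStarted"})
		case seen && prev == running && s == exited:
			evs = append(evs, event{id, "ContainerDied"})
		}
	}
	return evs
}

func main() {
	old := map[string]state{"fde8bbb9": running}
	cur := map[string]state{"fde8bbb9": exited} // exitCode observed separately
	fmt.Println(relist(old, cur))               // [{fde8bbb9 ContainerDied}]
}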
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.109603 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ceph\") pod \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.109740 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vczfc\" (UniqueName: \"kubernetes.io/projected/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-kube-api-access-vczfc\") pod \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.110014 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-inventory\") pod \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.110076 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ssh-key\") pod \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\" (UID: \"6f793d5a-649b-4ef6-9935-deabf8dcd0c8\") " Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.122213 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ceph" (OuterVolumeSpecName: "ceph") pod "6f793d5a-649b-4ef6-9935-deabf8dcd0c8" (UID: "6f793d5a-649b-4ef6-9935-deabf8dcd0c8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.122287 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-kube-api-access-vczfc" (OuterVolumeSpecName: "kube-api-access-vczfc") pod "6f793d5a-649b-4ef6-9935-deabf8dcd0c8" (UID: "6f793d5a-649b-4ef6-9935-deabf8dcd0c8"). InnerVolumeSpecName "kube-api-access-vczfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.144174 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-inventory" (OuterVolumeSpecName: "inventory") pod "6f793d5a-649b-4ef6-9935-deabf8dcd0c8" (UID: "6f793d5a-649b-4ef6-9935-deabf8dcd0c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.146463 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f793d5a-649b-4ef6-9935-deabf8dcd0c8" (UID: "6f793d5a-649b-4ef6-9935-deabf8dcd0c8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.214020 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.214069 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.214084 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.214097 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vczfc\" (UniqueName: \"kubernetes.io/projected/6f793d5a-649b-4ef6-9935-deabf8dcd0c8-kube-api-access-vczfc\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:37 crc kubenswrapper[4929]: I1002 13:19:37.950852 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-z8dbl" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.060222 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gvqsj"] Oct 02 13:19:38 crc kubenswrapper[4929]: E1002 13:19:38.060788 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f793d5a-649b-4ef6-9935-deabf8dcd0c8" containerName="ceph-client-openstack-openstack-cell1" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.060806 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f793d5a-649b-4ef6-9935-deabf8dcd0c8" containerName="ceph-client-openstack-openstack-cell1" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.061029 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f793d5a-649b-4ef6-9935-deabf8dcd0c8" containerName="ceph-client-openstack-openstack-cell1" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.061821 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.064040 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.064040 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.064369 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.064567 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.064800 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.071310 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gvqsj"] Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.133160 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jszcq\" (UniqueName: \"kubernetes.io/projected/d61ace99-a0d0-431c-989a-586bdff5c0de-kube-api-access-jszcq\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.133242 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d61ace99-a0d0-431c-989a-586bdff5c0de-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.133286 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.133337 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-inventory\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.133555 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ssh-key\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.133641 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ceph\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: 
\"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.236952 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jszcq\" (UniqueName: \"kubernetes.io/projected/d61ace99-a0d0-431c-989a-586bdff5c0de-kube-api-access-jszcq\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.237038 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d61ace99-a0d0-431c-989a-586bdff5c0de-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.237127 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-inventory\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.237157 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.238385 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d61ace99-a0d0-431c-989a-586bdff5c0de-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.238534 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ssh-key\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.238616 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ceph\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.242459 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.242560 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ssh-key\") 
pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.243264 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ceph\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.249631 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-inventory\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.254206 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jszcq\" (UniqueName: \"kubernetes.io/projected/d61ace99-a0d0-431c-989a-586bdff5c0de-kube-api-access-jszcq\") pod \"ovn-openstack-openstack-cell1-gvqsj\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") " pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.395942 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gvqsj" Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.927888 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-gvqsj"] Oct 02 13:19:38 crc kubenswrapper[4929]: I1002 13:19:38.961821 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gvqsj" event={"ID":"d61ace99-a0d0-431c-989a-586bdff5c0de","Type":"ContainerStarted","Data":"2f674f312b291351af7286738364726aab0dda974cdfdd082cdeb4714c7df746"} Oct 02 13:19:40 crc kubenswrapper[4929]: I1002 13:19:40.983389 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gvqsj" event={"ID":"d61ace99-a0d0-431c-989a-586bdff5c0de","Type":"ContainerStarted","Data":"0570433ceeb7f91652e99f8634ea382dc47d42d21b8933475ace9db49866c8da"} Oct 02 13:19:41 crc kubenswrapper[4929]: I1002 13:19:41.000590 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-gvqsj" podStartSLOduration=1.8909534749999999 podStartE2EDuration="3.000568957s" podCreationTimestamp="2025-10-02 13:19:38 +0000 UTC" firstStartedPulling="2025-10-02 13:19:38.927191747 +0000 UTC m=+7779.477558111" lastFinishedPulling="2025-10-02 13:19:40.036807229 +0000 UTC m=+7780.587173593" observedRunningTime="2025-10-02 13:19:40.997310862 +0000 UTC m=+7781.547677226" watchObservedRunningTime="2025-10-02 13:19:41.000568957 +0000 UTC m=+7781.550935341" Oct 02 13:20:44 crc kubenswrapper[4929]: I1002 13:20:44.626268 4929 generic.go:334] "Generic (PLEG): container finished" podID="d61ace99-a0d0-431c-989a-586bdff5c0de" containerID="0570433ceeb7f91652e99f8634ea382dc47d42d21b8933475ace9db49866c8da" exitCode=0 Oct 02 13:20:44 crc kubenswrapper[4929]: I1002 13:20:44.626364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gvqsj" event={"ID":"d61ace99-a0d0-431c-989a-586bdff5c0de","Type":"ContainerDied","Data":"0570433ceeb7f91652e99f8634ea382dc47d42d21b8933475ace9db49866c8da"} Oct 02 13:20:46 crc 
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.060632 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gvqsj"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.140139 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ovn-combined-ca-bundle\") pod \"d61ace99-a0d0-431c-989a-586bdff5c0de\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") "
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.140239 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ceph\") pod \"d61ace99-a0d0-431c-989a-586bdff5c0de\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") "
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.140266 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jszcq\" (UniqueName: \"kubernetes.io/projected/d61ace99-a0d0-431c-989a-586bdff5c0de-kube-api-access-jszcq\") pod \"d61ace99-a0d0-431c-989a-586bdff5c0de\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") "
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.140349 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d61ace99-a0d0-431c-989a-586bdff5c0de-ovncontroller-config-0\") pod \"d61ace99-a0d0-431c-989a-586bdff5c0de\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") "
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.140394 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-inventory\") pod \"d61ace99-a0d0-431c-989a-586bdff5c0de\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") "
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.140528 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ssh-key\") pod \"d61ace99-a0d0-431c-989a-586bdff5c0de\" (UID: \"d61ace99-a0d0-431c-989a-586bdff5c0de\") "
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.157815 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d61ace99-a0d0-431c-989a-586bdff5c0de" (UID: "d61ace99-a0d0-431c-989a-586bdff5c0de"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.157928 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61ace99-a0d0-431c-989a-586bdff5c0de-kube-api-access-jszcq" (OuterVolumeSpecName: "kube-api-access-jszcq") pod "d61ace99-a0d0-431c-989a-586bdff5c0de" (UID: "d61ace99-a0d0-431c-989a-586bdff5c0de"). InnerVolumeSpecName "kube-api-access-jszcq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.159331 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ceph" (OuterVolumeSpecName: "ceph") pod "d61ace99-a0d0-431c-989a-586bdff5c0de" (UID: "d61ace99-a0d0-431c-989a-586bdff5c0de"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.166490 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61ace99-a0d0-431c-989a-586bdff5c0de-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d61ace99-a0d0-431c-989a-586bdff5c0de" (UID: "d61ace99-a0d0-431c-989a-586bdff5c0de"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.170318 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d61ace99-a0d0-431c-989a-586bdff5c0de" (UID: "d61ace99-a0d0-431c-989a-586bdff5c0de"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.184714 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-inventory" (OuterVolumeSpecName: "inventory") pod "d61ace99-a0d0-431c-989a-586bdff5c0de" (UID: "d61ace99-a0d0-431c-989a-586bdff5c0de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.242645 4929 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d61ace99-a0d0-431c-989a-586bdff5c0de-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.242683 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-inventory\") on node \"crc\" DevicePath \"\""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.242693 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.242704 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.242712 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d61ace99-a0d0-431c-989a-586bdff5c0de-ceph\") on node \"crc\" DevicePath \"\""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.242720 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jszcq\" (UniqueName: \"kubernetes.io/projected/d61ace99-a0d0-431c-989a-586bdff5c0de-kube-api-access-jszcq\") on node \"crc\" DevicePath \"\""
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.646866 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-gvqsj" event={"ID":"d61ace99-a0d0-431c-989a-586bdff5c0de","Type":"ContainerDied","Data":"2f674f312b291351af7286738364726aab0dda974cdfdd082cdeb4714c7df746"}
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.646916 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f674f312b291351af7286738364726aab0dda974cdfdd082cdeb4714c7df746"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.646920 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-gvqsj"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.744084 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"]
Oct 02 13:20:46 crc kubenswrapper[4929]: E1002 13:20:46.744705 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61ace99-a0d0-431c-989a-586bdff5c0de" containerName="ovn-openstack-openstack-cell1"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.744730 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61ace99-a0d0-431c-989a-586bdff5c0de" containerName="ovn-openstack-openstack-cell1"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.745055 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61ace99-a0d0-431c-989a-586bdff5c0de" containerName="ovn-openstack-openstack-cell1"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.746049 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.749366 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.749625 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.749842 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.750027 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.750179 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.750505 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.758254 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"]
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.854077 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.854115 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.854155 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.854190 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.854360 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.854707 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.854818 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dv6\" (UniqueName: \"kubernetes.io/projected/479d3827-fdee-4b7a-b659-6fc9a86f0508-kube-api-access-z7dv6\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.956579 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.956653 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.956709 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.956816 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.956866 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dv6\" (UniqueName: \"kubernetes.io/projected/479d3827-fdee-4b7a-b659-6fc9a86f0508-kube-api-access-z7dv6\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.957483 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.957518 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.963744 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.963773 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.964056 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.964274 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.964352 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.964739 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:46 crc kubenswrapper[4929]: I1002 13:20:46.977437 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7dv6\" (UniqueName: \"kubernetes.io/projected/479d3827-fdee-4b7a-b659-6fc9a86f0508-kube-api-access-z7dv6\") pod \"neutron-metadata-openstack-openstack-cell1-4w5vl\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:47 crc kubenswrapper[4929]: I1002 13:20:47.063810 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:20:47 crc kubenswrapper[4929]: W1002 13:20:47.626080 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod479d3827_fdee_4b7a_b659_6fc9a86f0508.slice/crio-f4305a6fe5ffc19a2531e3dc12e7ac0faa32c1e9d104dd1b67833c0f745ec213 WatchSource:0}: Error finding container f4305a6fe5ffc19a2531e3dc12e7ac0faa32c1e9d104dd1b67833c0f745ec213: Status 404 returned error can't find the container with id f4305a6fe5ffc19a2531e3dc12e7ac0faa32c1e9d104dd1b67833c0f745ec213
Oct 02 13:20:47 crc kubenswrapper[4929]: I1002 13:20:47.627362 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"]
Oct 02 13:20:47 crc kubenswrapper[4929]: I1002 13:20:47.658972 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl" event={"ID":"479d3827-fdee-4b7a-b659-6fc9a86f0508","Type":"ContainerStarted","Data":"f4305a6fe5ffc19a2531e3dc12e7ac0faa32c1e9d104dd1b67833c0f745ec213"}
Oct 02 13:20:49 crc kubenswrapper[4929]: I1002 13:20:49.679494 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl" event={"ID":"479d3827-fdee-4b7a-b659-6fc9a86f0508","Type":"ContainerStarted","Data":"48c5f70a91234b26377bd79c1ac89dfcf5f8733c64a9559817ccf0d006ffac56"}
Oct 02 13:20:49 crc kubenswrapper[4929]: I1002 13:20:49.705357 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl" podStartSLOduration=2.883224765 podStartE2EDuration="3.705336128s" podCreationTimestamp="2025-10-02 13:20:46 +0000 UTC" firstStartedPulling="2025-10-02 13:20:47.628755675 +0000 UTC m=+7848.179122049" lastFinishedPulling="2025-10-02 13:20:48.450867048 +0000 UTC m=+7849.001233412" observedRunningTime="2025-10-02 13:20:49.69818931 +0000 UTC m=+7850.248555674" watchObservedRunningTime="2025-10-02 13:20:49.705336128 +0000 UTC m=+7850.255702492"
Oct 02 13:21:14 crc kubenswrapper[4929]: I1002 13:21:14.736589 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:21:14 crc kubenswrapper[4929]: I1002 13:21:14.736995 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:21:41 crc kubenswrapper[4929]: I1002 13:21:41.210322 4929 generic.go:334] "Generic (PLEG): container finished" podID="479d3827-fdee-4b7a-b659-6fc9a86f0508" containerID="48c5f70a91234b26377bd79c1ac89dfcf5f8733c64a9559817ccf0d006ffac56" exitCode=0
Oct 02 13:21:41 crc kubenswrapper[4929]: I1002 13:21:41.210422 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl" event={"ID":"479d3827-fdee-4b7a-b659-6fc9a86f0508","Type":"ContainerDied","Data":"48c5f70a91234b26377bd79c1ac89dfcf5f8733c64a9559817ccf0d006ffac56"}
Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.671286 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl"
Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.700385 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7dv6\" (UniqueName: \"kubernetes.io/projected/479d3827-fdee-4b7a-b659-6fc9a86f0508-kube-api-access-z7dv6\") pod \"479d3827-fdee-4b7a-b659-6fc9a86f0508\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") "
Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.700449 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ceph\") pod \"479d3827-fdee-4b7a-b659-6fc9a86f0508\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") "
Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.700553 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-ovn-metadata-agent-neutron-config-0\") pod \"479d3827-fdee-4b7a-b659-6fc9a86f0508\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") "
Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.700652 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-metadata-combined-ca-bundle\") pod \"479d3827-fdee-4b7a-b659-6fc9a86f0508\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") "
Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.700702 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-nova-metadata-neutron-config-0\") pod \"479d3827-fdee-4b7a-b659-6fc9a86f0508\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") "
\"479d3827-fdee-4b7a-b659-6fc9a86f0508\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.700901 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ssh-key\") pod \"479d3827-fdee-4b7a-b659-6fc9a86f0508\" (UID: \"479d3827-fdee-4b7a-b659-6fc9a86f0508\") " Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.707572 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "479d3827-fdee-4b7a-b659-6fc9a86f0508" (UID: "479d3827-fdee-4b7a-b659-6fc9a86f0508"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.707811 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479d3827-fdee-4b7a-b659-6fc9a86f0508-kube-api-access-z7dv6" (OuterVolumeSpecName: "kube-api-access-z7dv6") pod "479d3827-fdee-4b7a-b659-6fc9a86f0508" (UID: "479d3827-fdee-4b7a-b659-6fc9a86f0508"). InnerVolumeSpecName "kube-api-access-z7dv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.715199 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ceph" (OuterVolumeSpecName: "ceph") pod "479d3827-fdee-4b7a-b659-6fc9a86f0508" (UID: "479d3827-fdee-4b7a-b659-6fc9a86f0508"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.732139 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "479d3827-fdee-4b7a-b659-6fc9a86f0508" (UID: "479d3827-fdee-4b7a-b659-6fc9a86f0508"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.734484 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-inventory" (OuterVolumeSpecName: "inventory") pod "479d3827-fdee-4b7a-b659-6fc9a86f0508" (UID: "479d3827-fdee-4b7a-b659-6fc9a86f0508"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.741272 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "479d3827-fdee-4b7a-b659-6fc9a86f0508" (UID: "479d3827-fdee-4b7a-b659-6fc9a86f0508"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.741690 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "479d3827-fdee-4b7a-b659-6fc9a86f0508" (UID: "479d3827-fdee-4b7a-b659-6fc9a86f0508"). 
InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.803734 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.803770 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7dv6\" (UniqueName: \"kubernetes.io/projected/479d3827-fdee-4b7a-b659-6fc9a86f0508-kube-api-access-z7dv6\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.803780 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.803788 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.803799 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.803808 4929 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:42 crc kubenswrapper[4929]: I1002 13:21:42.803816 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/479d3827-fdee-4b7a-b659-6fc9a86f0508-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.230544 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl" event={"ID":"479d3827-fdee-4b7a-b659-6fc9a86f0508","Type":"ContainerDied","Data":"f4305a6fe5ffc19a2531e3dc12e7ac0faa32c1e9d104dd1b67833c0f745ec213"} Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.230864 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4305a6fe5ffc19a2531e3dc12e7ac0faa32c1e9d104dd1b67833c0f745ec213" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.230619 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-4w5vl" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.311813 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4jxh7"] Oct 02 13:21:43 crc kubenswrapper[4929]: E1002 13:21:43.312392 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479d3827-fdee-4b7a-b659-6fc9a86f0508" containerName="neutron-metadata-openstack-openstack-cell1" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.312415 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="479d3827-fdee-4b7a-b659-6fc9a86f0508" containerName="neutron-metadata-openstack-openstack-cell1" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.312688 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="479d3827-fdee-4b7a-b659-6fc9a86f0508" containerName="neutron-metadata-openstack-openstack-cell1" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.313572 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.315506 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.315798 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.315842 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.316048 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.316208 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.323546 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4jxh7"] Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.421547 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.421625 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ceph\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.421688 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhpv\" (UniqueName: \"kubernetes.io/projected/54ed1de3-f724-4239-8d13-a33ca45c5d4b-kube-api-access-xdhpv\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.421711 4929 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-inventory\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.421770 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.421788 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ssh-key\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.524293 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.524369 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ceph\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.524428 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhpv\" (UniqueName: \"kubernetes.io/projected/54ed1de3-f724-4239-8d13-a33ca45c5d4b-kube-api-access-xdhpv\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.524502 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-inventory\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.524590 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.524608 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ssh-key\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: 
\"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.528930 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ceph\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.529451 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-inventory\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.529682 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.530108 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ssh-key\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.531615 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.546329 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhpv\" (UniqueName: \"kubernetes.io/projected/54ed1de3-f724-4239-8d13-a33ca45c5d4b-kube-api-access-xdhpv\") pod \"libvirt-openstack-openstack-cell1-4jxh7\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") " pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:43 crc kubenswrapper[4929]: I1002 13:21:43.635975 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:21:44 crc kubenswrapper[4929]: I1002 13:21:44.154111 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-4jxh7"] Oct 02 13:21:44 crc kubenswrapper[4929]: I1002 13:21:44.159389 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:21:44 crc kubenswrapper[4929]: I1002 13:21:44.253697 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" event={"ID":"54ed1de3-f724-4239-8d13-a33ca45c5d4b","Type":"ContainerStarted","Data":"cc76d9ea62a506806057f58a4362d248b31dd5e1c126e7ec1241480ca53239ea"} Oct 02 13:21:44 crc kubenswrapper[4929]: I1002 13:21:44.736977 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:21:44 crc kubenswrapper[4929]: I1002 13:21:44.737023 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:21:45 crc kubenswrapper[4929]: I1002 13:21:45.267000 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" event={"ID":"54ed1de3-f724-4239-8d13-a33ca45c5d4b","Type":"ContainerStarted","Data":"8b55f29d695ddfa04b50e1979ca87ee7cb2dc39610e60a3905fb151f3110d3d4"} Oct 02 13:21:45 crc kubenswrapper[4929]: I1002 13:21:45.294996 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" podStartSLOduration=1.716985052 podStartE2EDuration="2.294976594s" podCreationTimestamp="2025-10-02 13:21:43 +0000 UTC" firstStartedPulling="2025-10-02 13:21:44.159060489 +0000 UTC m=+7904.709426853" lastFinishedPulling="2025-10-02 13:21:44.737052031 +0000 UTC m=+7905.287418395" observedRunningTime="2025-10-02 13:21:45.286640814 +0000 UTC m=+7905.837007178" watchObservedRunningTime="2025-10-02 13:21:45.294976594 +0000 UTC m=+7905.845342958" Oct 02 13:22:14 crc kubenswrapper[4929]: I1002 13:22:14.736687 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:22:14 crc kubenswrapper[4929]: I1002 13:22:14.737876 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:22:14 crc kubenswrapper[4929]: I1002 13:22:14.737993 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:22:14 crc kubenswrapper[4929]: I1002 13:22:14.739725 4929 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21a8a6b45d1a4cae8908735413cb71002fff687b44ca377226351587dfe46ed8"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:22:14 crc kubenswrapper[4929]: I1002 13:22:14.739836 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://21a8a6b45d1a4cae8908735413cb71002fff687b44ca377226351587dfe46ed8" gracePeriod=600 Oct 02 13:22:15 crc kubenswrapper[4929]: I1002 13:22:15.556284 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="21a8a6b45d1a4cae8908735413cb71002fff687b44ca377226351587dfe46ed8" exitCode=0 Oct 02 13:22:15 crc kubenswrapper[4929]: I1002 13:22:15.556364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"21a8a6b45d1a4cae8908735413cb71002fff687b44ca377226351587dfe46ed8"} Oct 02 13:22:15 crc kubenswrapper[4929]: I1002 13:22:15.557279 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"} Oct 02 13:22:15 crc kubenswrapper[4929]: I1002 13:22:15.557312 4929 scope.go:117] "RemoveContainer" containerID="960be9a5f58d703c6a42ef0c18583557d13c85dc46acee977f30b4248de71ab2" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.105669 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7dlvd"] Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.109155 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.135416 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7dlvd"] Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.301095 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dz42\" (UniqueName: \"kubernetes.io/projected/50083b96-5603-412c-b283-430e43790b81-kube-api-access-8dz42\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.301249 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50083b96-5603-412c-b283-430e43790b81-utilities\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.301412 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50083b96-5603-412c-b283-430e43790b81-catalog-content\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.403065 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50083b96-5603-412c-b283-430e43790b81-catalog-content\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.403468 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dz42\" (UniqueName: \"kubernetes.io/projected/50083b96-5603-412c-b283-430e43790b81-kube-api-access-8dz42\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.403531 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50083b96-5603-412c-b283-430e43790b81-utilities\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.403673 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50083b96-5603-412c-b283-430e43790b81-catalog-content\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.403904 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50083b96-5603-412c-b283-430e43790b81-utilities\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.423504 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8dz42\" (UniqueName: \"kubernetes.io/projected/50083b96-5603-412c-b283-430e43790b81-kube-api-access-8dz42\") pod \"redhat-operators-7dlvd\" (UID: \"50083b96-5603-412c-b283-430e43790b81\") " pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.430359 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7dlvd" Oct 02 13:24:13 crc kubenswrapper[4929]: I1002 13:24:13.916772 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7dlvd"] Oct 02 13:24:14 crc kubenswrapper[4929]: I1002 13:24:14.799129 4929 generic.go:334] "Generic (PLEG): container finished" podID="50083b96-5603-412c-b283-430e43790b81" containerID="f394e889eac420d16d8a3952bfa8dd0d42c72fc29ffa618670bc4227d1784708" exitCode=0 Oct 02 13:24:14 crc kubenswrapper[4929]: I1002 13:24:14.799394 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7dlvd" event={"ID":"50083b96-5603-412c-b283-430e43790b81","Type":"ContainerDied","Data":"f394e889eac420d16d8a3952bfa8dd0d42c72fc29ffa618670bc4227d1784708"} Oct 02 13:24:14 crc kubenswrapper[4929]: I1002 13:24:14.799426 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7dlvd" event={"ID":"50083b96-5603-412c-b283-430e43790b81","Type":"ContainerStarted","Data":"74c57700c6353e56e8121dbed6942f8e0be6d7807bc62920fd1c6a7651f50587"} Oct 02 13:24:24 crc kubenswrapper[4929]: I1002 13:24:24.915698 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7dlvd" event={"ID":"50083b96-5603-412c-b283-430e43790b81","Type":"ContainerStarted","Data":"239aa1eb5e10a8d04a9e55486cbf982722e3ef148731581c18285235eeb6c0d2"} Oct 02 13:24:25 crc kubenswrapper[4929]: I1002 13:24:25.934385 4929 generic.go:334] "Generic (PLEG): container finished" podID="50083b96-5603-412c-b283-430e43790b81" containerID="239aa1eb5e10a8d04a9e55486cbf982722e3ef148731581c18285235eeb6c0d2" exitCode=0 Oct 02 13:24:25 crc kubenswrapper[4929]: I1002 13:24:25.934615 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7dlvd" event={"ID":"50083b96-5603-412c-b283-430e43790b81","Type":"ContainerDied","Data":"239aa1eb5e10a8d04a9e55486cbf982722e3ef148731581c18285235eeb6c0d2"} Oct 02 13:24:29 crc kubenswrapper[4929]: I1002 13:24:29.006445 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7dlvd" event={"ID":"50083b96-5603-412c-b283-430e43790b81","Type":"ContainerStarted","Data":"65719f0a00a3571f992714e1f221a54925952ed433998606f21130c54b8e11e7"} Oct 02 13:24:29 crc kubenswrapper[4929]: I1002 13:24:29.030818 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7dlvd" podStartSLOduration=3.074829149 podStartE2EDuration="16.030798382s" podCreationTimestamp="2025-10-02 13:24:13 +0000 UTC" firstStartedPulling="2025-10-02 13:24:14.801582522 +0000 UTC m=+8055.351948886" lastFinishedPulling="2025-10-02 13:24:27.757551765 +0000 UTC m=+8068.307918119" observedRunningTime="2025-10-02 13:24:29.028500706 +0000 UTC m=+8069.578867080" watchObservedRunningTime="2025-10-02 13:24:29.030798382 +0000 UTC m=+8069.581164756" Oct 02 13:24:33 crc kubenswrapper[4929]: I1002 13:24:33.430941 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7dlvd" 
Oct 02 13:24:33 crc kubenswrapper[4929]: I1002 13:24:33.431468 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7dlvd"
Oct 02 13:24:33 crc kubenswrapper[4929]: I1002 13:24:33.482681 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7dlvd"
Oct 02 13:24:34 crc kubenswrapper[4929]: I1002 13:24:34.128737 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7dlvd"
Oct 02 13:24:34 crc kubenswrapper[4929]: I1002 13:24:34.207982 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7dlvd"]
Oct 02 13:24:34 crc kubenswrapper[4929]: I1002 13:24:34.248825 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z57r9"]
Oct 02 13:24:34 crc kubenswrapper[4929]: I1002 13:24:34.249520 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z57r9" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="registry-server" containerID="cri-o://73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892" gracePeriod=2
Oct 02 13:24:34 crc kubenswrapper[4929]: I1002 13:24:34.913315 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z57r9"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.093565 4929 generic.go:334] "Generic (PLEG): container finished" podID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerID="73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892" exitCode=0
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.093614 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z57r9"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.093648 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z57r9" event={"ID":"94b01c90-c88b-4218-9287-e4f5df0e2677","Type":"ContainerDied","Data":"73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892"}
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.093759 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z57r9" event={"ID":"94b01c90-c88b-4218-9287-e4f5df0e2677","Type":"ContainerDied","Data":"e5b033259894be4d4feef6deafe5c95b458d308b11c2335b07add7030e5ec891"}
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.093802 4929 scope.go:117] "RemoveContainer" containerID="73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.110847 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-utilities\") pod \"94b01c90-c88b-4218-9287-e4f5df0e2677\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") "
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.110927 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh2t6\" (UniqueName: \"kubernetes.io/projected/94b01c90-c88b-4218-9287-e4f5df0e2677-kube-api-access-kh2t6\") pod \"94b01c90-c88b-4218-9287-e4f5df0e2677\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") "
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.111039 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-catalog-content\") pod \"94b01c90-c88b-4218-9287-e4f5df0e2677\" (UID: \"94b01c90-c88b-4218-9287-e4f5df0e2677\") "
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.119681 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-utilities" (OuterVolumeSpecName: "utilities") pod "94b01c90-c88b-4218-9287-e4f5df0e2677" (UID: "94b01c90-c88b-4218-9287-e4f5df0e2677"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.121860 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b01c90-c88b-4218-9287-e4f5df0e2677-kube-api-access-kh2t6" (OuterVolumeSpecName: "kube-api-access-kh2t6") pod "94b01c90-c88b-4218-9287-e4f5df0e2677" (UID: "94b01c90-c88b-4218-9287-e4f5df0e2677"). InnerVolumeSpecName "kube-api-access-kh2t6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.130559 4929 scope.go:117] "RemoveContainer" containerID="99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.194644 4929 scope.go:117] "RemoveContainer" containerID="8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.196558 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94b01c90-c88b-4218-9287-e4f5df0e2677" (UID: "94b01c90-c88b-4218-9287-e4f5df0e2677"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.214809 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.214849 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94b01c90-c88b-4218-9287-e4f5df0e2677-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.214864 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh2t6\" (UniqueName: \"kubernetes.io/projected/94b01c90-c88b-4218-9287-e4f5df0e2677-kube-api-access-kh2t6\") on node \"crc\" DevicePath \"\""
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.254952 4929 scope.go:117] "RemoveContainer" containerID="73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892"
Oct 02 13:24:35 crc kubenswrapper[4929]: E1002 13:24:35.255486 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892\": container with ID starting with 73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892 not found: ID does not exist" containerID="73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.255531 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892"} err="failed to get container status \"73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892\": rpc error: code = NotFound desc = could not find container \"73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892\": container with ID starting with 73d504178c70365c40d254a2de816966f9dd52f8fbfd141f7ec4ee9c533b8892 not found: ID does not exist"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.255558 4929 scope.go:117] "RemoveContainer" containerID="99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba"
Oct 02 13:24:35 crc kubenswrapper[4929]: E1002 13:24:35.255930 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba\": container with ID starting with 99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba not found: ID does not exist" containerID="99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.256057 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba"} err="failed to get container status \"99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba\": rpc error: code = NotFound desc = could not find container \"99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba\": container with ID starting with 99c2badc1830ee8a54e4dc19035e1827f1bd9cb03d177590bf4415a82e209eba not found: ID does not exist"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.256087 4929 scope.go:117] "RemoveContainer" containerID="8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18"
Oct 02 13:24:35 crc kubenswrapper[4929]: E1002 13:24:35.256403 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18\": container with ID starting with 8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18 not found: ID does not exist" containerID="8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.256430 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18"} err="failed to get container status \"8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18\": rpc error: code = NotFound desc = could not find container \"8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18\": container with ID starting with 8e45f5fe272cd616165fac6e2717cd6c3ccf67e72d9da7e32c15176e53312d18 not found: ID does not exist"
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.456874 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z57r9"]
Oct 02 13:24:35 crc kubenswrapper[4929]: I1002 13:24:35.504466 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z57r9"]
Oct 02 13:24:36 crc kubenswrapper[4929]: I1002 13:24:36.169007 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" path="/var/lib/kubelet/pods/94b01c90-c88b-4218-9287-e4f5df0e2677/volumes"
Oct 02 13:24:44 crc kubenswrapper[4929]: I1002 13:24:44.736291 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:24:44 crc kubenswrapper[4929]: I1002 13:24:44.737068 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:25:14 crc kubenswrapper[4929]: I1002 13:25:14.736559 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:25:14 crc kubenswrapper[4929]: I1002 13:25:14.738118 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:25:44 crc kubenswrapper[4929]: I1002 13:25:44.737665 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:25:44 crc kubenswrapper[4929]: I1002 13:25:44.738602 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:25:44 crc kubenswrapper[4929]: I1002 13:25:44.738739 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 13:25:44 crc kubenswrapper[4929]: I1002 13:25:44.739691 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 13:25:44 crc kubenswrapper[4929]: I1002 13:25:44.739759 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" gracePeriod=600
Oct 02 13:25:44 crc kubenswrapper[4929]: E1002 13:25:44.877821 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:25:45 crc kubenswrapper[4929]: I1002 13:25:45.811263 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" exitCode=0
Oct 02 13:25:45 crc kubenswrapper[4929]: I1002 13:25:45.811344 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"}
Oct 02 13:25:45 crc kubenswrapper[4929]: I1002 13:25:45.811611 4929 scope.go:117] "RemoveContainer" containerID="21a8a6b45d1a4cae8908735413cb71002fff687b44ca377226351587dfe46ed8"
Oct 02 13:25:45 crc kubenswrapper[4929]: I1002 13:25:45.813134 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:25:45 crc kubenswrapper[4929]: E1002 13:25:45.813634 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:25:58 crc kubenswrapper[4929]: I1002 13:25:58.157429 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:25:58 crc kubenswrapper[4929]: E1002 13:25:58.158357 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:26:12 crc kubenswrapper[4929]: I1002 13:26:12.157683 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:26:12 crc kubenswrapper[4929]: E1002 13:26:12.158523 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.426532 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nhmwh"]
Oct 02 13:26:15 crc kubenswrapper[4929]: E1002 13:26:15.431532 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="registry-server"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.431566 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="registry-server"
Oct 02 13:26:15 crc kubenswrapper[4929]: E1002 13:26:15.431583 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="extract-utilities"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.431593 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="extract-utilities"
Oct 02 13:26:15 crc kubenswrapper[4929]: E1002 13:26:15.431608 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="extract-content"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.431616 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="extract-content"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.431843 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b01c90-c88b-4218-9287-e4f5df0e2677" containerName="registry-server"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.433998 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.438052 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhmwh"]
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.504658 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-catalog-content\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.504726 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-utilities\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.504808 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdl8m\" (UniqueName: \"kubernetes.io/projected/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-kube-api-access-cdl8m\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.606741 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-catalog-content\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.606804 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-utilities\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.606900 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdl8m\" (UniqueName: \"kubernetes.io/projected/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-kube-api-access-cdl8m\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.607378 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-catalog-content\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.607574 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-utilities\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.626101 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdl8m\" (UniqueName: \"kubernetes.io/projected/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-kube-api-access-cdl8m\") pod \"redhat-marketplace-nhmwh\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:15 crc kubenswrapper[4929]: I1002 13:26:15.765251 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhmwh"
Oct 02 13:26:16 crc kubenswrapper[4929]: I1002 13:26:16.277808 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhmwh"]
Oct 02 13:26:17 crc kubenswrapper[4929]: I1002 13:26:17.139206 4929 generic.go:334] "Generic (PLEG): container finished" podID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerID="685ed4ef40b307c0934dc4f216e37e5f10c5b69ee8e05c836c88cb0bd7d5bf48" exitCode=0
Oct 02 13:26:17 crc kubenswrapper[4929]: I1002 13:26:17.139292 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhmwh" event={"ID":"a021ef81-1d0c-4fd0-838e-89c7bb921fc7","Type":"ContainerDied","Data":"685ed4ef40b307c0934dc4f216e37e5f10c5b69ee8e05c836c88cb0bd7d5bf48"}
Oct 02 13:26:17 crc kubenswrapper[4929]: I1002 13:26:17.139592 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhmwh" event={"ID":"a021ef81-1d0c-4fd0-838e-89c7bb921fc7","Type":"ContainerStarted","Data":"570eee5ceef52bfc297da7f61a0e0af4aa9c396ca82a3d29c6522cafd9e8acf3"}
Oct 02 13:26:18 crc kubenswrapper[4929]: I1002 13:26:18.173168 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhmwh" event={"ID":"a021ef81-1d0c-4fd0-838e-89c7bb921fc7","Type":"ContainerStarted","Data":"ca24cf8a7853eade963e2e22da3870f5f24f9a7902eba5bae40e351d1f75c792"}
Oct 02 13:26:19 crc kubenswrapper[4929]: I1002 13:26:19.170503 4929 generic.go:334] "Generic (PLEG): container finished" podID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerID="ca24cf8a7853eade963e2e22da3870f5f24f9a7902eba5bae40e351d1f75c792" exitCode=0
Oct 02 13:26:19 crc kubenswrapper[4929]: I1002 13:26:19.170639 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhmwh" event={"ID":"a021ef81-1d0c-4fd0-838e-89c7bb921fc7","Type":"ContainerDied","Data":"ca24cf8a7853eade963e2e22da3870f5f24f9a7902eba5bae40e351d1f75c792"}
Oct 02 13:26:20 crc kubenswrapper[4929]: I1002 13:26:20.181676 4929 generic.go:334] "Generic (PLEG): container finished" podID="54ed1de3-f724-4239-8d13-a33ca45c5d4b" containerID="8b55f29d695ddfa04b50e1979ca87ee7cb2dc39610e60a3905fb151f3110d3d4" exitCode=0
Oct 02 13:26:20 crc kubenswrapper[4929]: I1002 13:26:20.181785 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" event={"ID":"54ed1de3-f724-4239-8d13-a33ca45c5d4b","Type":"ContainerDied","Data":"8b55f29d695ddfa04b50e1979ca87ee7cb2dc39610e60a3905fb151f3110d3d4"}
Oct 02 13:26:20 crc kubenswrapper[4929]: I1002 13:26:20.187581 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhmwh" event={"ID":"a021ef81-1d0c-4fd0-838e-89c7bb921fc7","Type":"ContainerStarted","Data":"bee305162059d5466bd20be26f0ec141dff2e19c478c5eb7edab3bce75eb408d"}
Oct 02 13:26:20 crc kubenswrapper[4929]: I1002 13:26:20.221346 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nhmwh" podStartSLOduration=2.826040557 podStartE2EDuration="5.221324348s" podCreationTimestamp="2025-10-02 13:26:15 +0000 UTC" firstStartedPulling="2025-10-02 13:26:17.142552071 +0000 UTC m=+8177.692918435" lastFinishedPulling="2025-10-02 13:26:19.537835862 +0000 UTC m=+8180.088202226" observedRunningTime="2025-10-02 13:26:20.214678786 +0000 UTC m=+8180.765045150" watchObservedRunningTime="2025-10-02 13:26:20.221324348 +0000 UTC m=+8180.771690712"
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.671400 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7"
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.755088 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-inventory\") pod \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") "
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.755179 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-combined-ca-bundle\") pod \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") "
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.755249 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ceph\") pod \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") "
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.755269 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ssh-key\") pod \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") "
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.755323 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhpv\" (UniqueName: \"kubernetes.io/projected/54ed1de3-f724-4239-8d13-a33ca45c5d4b-kube-api-access-xdhpv\") pod \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") "
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.755355 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-secret-0\") pod \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\" (UID: \"54ed1de3-f724-4239-8d13-a33ca45c5d4b\") "
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.763005 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ceph" (OuterVolumeSpecName: "ceph") pod "54ed1de3-f724-4239-8d13-a33ca45c5d4b" (UID: "54ed1de3-f724-4239-8d13-a33ca45c5d4b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.763707 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ed1de3-f724-4239-8d13-a33ca45c5d4b-kube-api-access-xdhpv" (OuterVolumeSpecName: "kube-api-access-xdhpv") pod "54ed1de3-f724-4239-8d13-a33ca45c5d4b" (UID: "54ed1de3-f724-4239-8d13-a33ca45c5d4b"). InnerVolumeSpecName "kube-api-access-xdhpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.764234 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "54ed1de3-f724-4239-8d13-a33ca45c5d4b" (UID: "54ed1de3-f724-4239-8d13-a33ca45c5d4b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.794342 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-inventory" (OuterVolumeSpecName: "inventory") pod "54ed1de3-f724-4239-8d13-a33ca45c5d4b" (UID: "54ed1de3-f724-4239-8d13-a33ca45c5d4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.795382 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "54ed1de3-f724-4239-8d13-a33ca45c5d4b" (UID: "54ed1de3-f724-4239-8d13-a33ca45c5d4b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.795593 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "54ed1de3-f724-4239-8d13-a33ca45c5d4b" (UID: "54ed1de3-f724-4239-8d13-a33ca45c5d4b"). InnerVolumeSpecName "libvirt-secret-0".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.858486 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.858535 4929 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.858549 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.858557 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.858566 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhpv\" (UniqueName: \"kubernetes.io/projected/54ed1de3-f724-4239-8d13-a33ca45c5d4b-kube-api-access-xdhpv\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:21 crc kubenswrapper[4929]: I1002 13:26:21.858576 4929 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/54ed1de3-f724-4239-8d13-a33ca45c5d4b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.207374 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" event={"ID":"54ed1de3-f724-4239-8d13-a33ca45c5d4b","Type":"ContainerDied","Data":"cc76d9ea62a506806057f58a4362d248b31dd5e1c126e7ec1241480ca53239ea"} Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.207850 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc76d9ea62a506806057f58a4362d248b31dd5e1c126e7ec1241480ca53239ea" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.207417 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-4jxh7" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.308326 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-8z5c4"] Oct 02 13:26:22 crc kubenswrapper[4929]: E1002 13:26:22.308789 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ed1de3-f724-4239-8d13-a33ca45c5d4b" containerName="libvirt-openstack-openstack-cell1" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.308804 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ed1de3-f724-4239-8d13-a33ca45c5d4b" containerName="libvirt-openstack-openstack-cell1" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.309053 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ed1de3-f724-4239-8d13-a33ca45c5d4b" containerName="libvirt-openstack-openstack-cell1" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.309842 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.321143 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.321416 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.321600 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.321778 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.321916 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.322579 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.322784 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.334802 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-8z5c4"] Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.370730 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.370808 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.370868 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371009 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ceph\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371048 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371072 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371108 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371208 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-kube-api-access-vhhfl\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371587 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371684 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.371732 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-inventory\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.474817 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.474912 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ssh-key\") pod 
\"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.474977 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475064 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ceph\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475111 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475147 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475186 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475219 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-kube-api-access-vhhfl\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475298 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475350 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.475390 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-inventory\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.476797 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.477593 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.479306 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.484409 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.484679 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-inventory\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.484714 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.484755 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.484679 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.484884 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ceph\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.485794 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.494328 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-kube-api-access-vhhfl\") pod \"nova-cell1-openstack-openstack-cell1-8z5c4\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:22 crc kubenswrapper[4929]: I1002 13:26:22.632127 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:26:23 crc kubenswrapper[4929]: I1002 13:26:23.157486 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:26:23 crc kubenswrapper[4929]: E1002 13:26:23.158607 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:26:23 crc kubenswrapper[4929]: I1002 13:26:23.273822 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-8z5c4"] Oct 02 13:26:24 crc kubenswrapper[4929]: I1002 13:26:24.228831 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" event={"ID":"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a","Type":"ContainerStarted","Data":"278a6ee4e7e721fff58201aca5b84d726a170b227f1a9905e64959cc152dbee5"} Oct 02 13:26:25 crc kubenswrapper[4929]: I1002 13:26:25.243011 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" event={"ID":"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a","Type":"ContainerStarted","Data":"f852c373621d1d0947fceb690e9bc72231b8615518911c4dbfcae8294e95d0a0"} Oct 02 13:26:25 crc kubenswrapper[4929]: I1002 13:26:25.264485 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" podStartSLOduration=2.563657451 podStartE2EDuration="3.264456135s" podCreationTimestamp="2025-10-02 13:26:22 +0000 UTC" 
firstStartedPulling="2025-10-02 13:26:23.284308849 +0000 UTC m=+8183.834675213" lastFinishedPulling="2025-10-02 13:26:23.985107533 +0000 UTC m=+8184.535473897" observedRunningTime="2025-10-02 13:26:25.258980157 +0000 UTC m=+8185.809346521" watchObservedRunningTime="2025-10-02 13:26:25.264456135 +0000 UTC m=+8185.814822499" Oct 02 13:26:25 crc kubenswrapper[4929]: I1002 13:26:25.766426 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nhmwh" Oct 02 13:26:25 crc kubenswrapper[4929]: I1002 13:26:25.766529 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nhmwh" Oct 02 13:26:25 crc kubenswrapper[4929]: I1002 13:26:25.814617 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nhmwh" Oct 02 13:26:26 crc kubenswrapper[4929]: I1002 13:26:26.304266 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nhmwh" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.211178 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vw68g"] Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.215256 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.233387 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vw68g"] Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.325399 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-catalog-content\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.325470 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmg7x\" (UniqueName: \"kubernetes.io/projected/22a8977c-c72d-45f3-8847-1790579022a0-kube-api-access-tmg7x\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.325513 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-utilities\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.427879 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-catalog-content\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.428071 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmg7x\" (UniqueName: \"kubernetes.io/projected/22a8977c-c72d-45f3-8847-1790579022a0-kube-api-access-tmg7x\") pod 
\"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.428129 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-utilities\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.428478 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-catalog-content\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.428730 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-utilities\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.447850 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmg7x\" (UniqueName: \"kubernetes.io/projected/22a8977c-c72d-45f3-8847-1790579022a0-kube-api-access-tmg7x\") pod \"certified-operators-vw68g\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.555832 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.832896 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2nzgr"] Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.835672 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.858856 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nzgr"] Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.956537 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-utilities\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.956643 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-catalog-content\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:29 crc kubenswrapper[4929]: I1002 13:26:29.956749 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkd6m\" (UniqueName: \"kubernetes.io/projected/16ec78ed-20d1-42ee-8538-fea19958d2dd-kube-api-access-zkd6m\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.059130 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-utilities\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.059266 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-catalog-content\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.059345 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkd6m\" (UniqueName: \"kubernetes.io/projected/16ec78ed-20d1-42ee-8538-fea19958d2dd-kube-api-access-zkd6m\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.059696 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-utilities\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.059750 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-catalog-content\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.076796 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zkd6m\" (UniqueName: \"kubernetes.io/projected/16ec78ed-20d1-42ee-8538-fea19958d2dd-kube-api-access-zkd6m\") pod \"community-operators-2nzgr\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.124453 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vw68g"] Oct 02 13:26:30 crc kubenswrapper[4929]: W1002 13:26:30.126626 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a8977c_c72d_45f3_8847_1790579022a0.slice/crio-8c4cb06839c50f45bcaca72fc61cf2c22a538135f7d195cf684623b087bbb245 WatchSource:0}: Error finding container 8c4cb06839c50f45bcaca72fc61cf2c22a538135f7d195cf684623b087bbb245: Status 404 returned error can't find the container with id 8c4cb06839c50f45bcaca72fc61cf2c22a538135f7d195cf684623b087bbb245 Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.189630 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.304348 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw68g" event={"ID":"22a8977c-c72d-45f3-8847-1790579022a0","Type":"ContainerStarted","Data":"8c4cb06839c50f45bcaca72fc61cf2c22a538135f7d195cf684623b087bbb245"} Oct 02 13:26:30 crc kubenswrapper[4929]: I1002 13:26:30.731379 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2nzgr"] Oct 02 13:26:31 crc kubenswrapper[4929]: I1002 13:26:31.317257 4929 generic.go:334] "Generic (PLEG): container finished" podID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerID="465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea" exitCode=0 Oct 02 13:26:31 crc kubenswrapper[4929]: I1002 13:26:31.317441 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nzgr" event={"ID":"16ec78ed-20d1-42ee-8538-fea19958d2dd","Type":"ContainerDied","Data":"465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea"} Oct 02 13:26:31 crc kubenswrapper[4929]: I1002 13:26:31.317600 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nzgr" event={"ID":"16ec78ed-20d1-42ee-8538-fea19958d2dd","Type":"ContainerStarted","Data":"b71593facafe63e8f7484c89aa9f0458e33437514354eb81f93493c71c62e15f"} Oct 02 13:26:31 crc kubenswrapper[4929]: I1002 13:26:31.319816 4929 generic.go:334] "Generic (PLEG): container finished" podID="22a8977c-c72d-45f3-8847-1790579022a0" containerID="d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1" exitCode=0 Oct 02 13:26:31 crc kubenswrapper[4929]: I1002 13:26:31.319867 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw68g" event={"ID":"22a8977c-c72d-45f3-8847-1790579022a0","Type":"ContainerDied","Data":"d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1"} Oct 02 13:26:32 crc kubenswrapper[4929]: I1002 13:26:32.334297 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nzgr" event={"ID":"16ec78ed-20d1-42ee-8538-fea19958d2dd","Type":"ContainerStarted","Data":"aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f"} Oct 02 13:26:33 crc kubenswrapper[4929]: 
I1002 13:26:33.349191 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw68g" event={"ID":"22a8977c-c72d-45f3-8847-1790579022a0","Type":"ContainerStarted","Data":"edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45"} Oct 02 13:26:34 crc kubenswrapper[4929]: I1002 13:26:34.001158 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhmwh"] Oct 02 13:26:34 crc kubenswrapper[4929]: I1002 13:26:34.001752 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nhmwh" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="registry-server" containerID="cri-o://bee305162059d5466bd20be26f0ec141dff2e19c478c5eb7edab3bce75eb408d" gracePeriod=2 Oct 02 13:26:34 crc kubenswrapper[4929]: I1002 13:26:34.363089 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhmwh" event={"ID":"a021ef81-1d0c-4fd0-838e-89c7bb921fc7","Type":"ContainerDied","Data":"bee305162059d5466bd20be26f0ec141dff2e19c478c5eb7edab3bce75eb408d"} Oct 02 13:26:34 crc kubenswrapper[4929]: I1002 13:26:34.363185 4929 generic.go:334] "Generic (PLEG): container finished" podID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerID="bee305162059d5466bd20be26f0ec141dff2e19c478c5eb7edab3bce75eb408d" exitCode=0 Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.157837 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:26:35 crc kubenswrapper[4929]: E1002 13:26:35.159090 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.303298 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhmwh" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.379326 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhmwh" event={"ID":"a021ef81-1d0c-4fd0-838e-89c7bb921fc7","Type":"ContainerDied","Data":"570eee5ceef52bfc297da7f61a0e0af4aa9c396ca82a3d29c6522cafd9e8acf3"} Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.380371 4929 scope.go:117] "RemoveContainer" containerID="bee305162059d5466bd20be26f0ec141dff2e19c478c5eb7edab3bce75eb408d" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.379840 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhmwh" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.382817 4929 generic.go:334] "Generic (PLEG): container finished" podID="22a8977c-c72d-45f3-8847-1790579022a0" containerID="edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45" exitCode=0 Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.383047 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw68g" event={"ID":"22a8977c-c72d-45f3-8847-1790579022a0","Type":"ContainerDied","Data":"edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45"} Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.387758 4929 generic.go:334] "Generic (PLEG): container finished" podID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerID="aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f" exitCode=0 Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.387869 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nzgr" event={"ID":"16ec78ed-20d1-42ee-8538-fea19958d2dd","Type":"ContainerDied","Data":"aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f"} Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.396411 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdl8m\" (UniqueName: \"kubernetes.io/projected/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-kube-api-access-cdl8m\") pod \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.396503 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-catalog-content\") pod \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.396621 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-utilities\") pod \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\" (UID: \"a021ef81-1d0c-4fd0-838e-89c7bb921fc7\") " Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.397561 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-utilities" (OuterVolumeSpecName: "utilities") pod "a021ef81-1d0c-4fd0-838e-89c7bb921fc7" (UID: "a021ef81-1d0c-4fd0-838e-89c7bb921fc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.406668 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-kube-api-access-cdl8m" (OuterVolumeSpecName: "kube-api-access-cdl8m") pod "a021ef81-1d0c-4fd0-838e-89c7bb921fc7" (UID: "a021ef81-1d0c-4fd0-838e-89c7bb921fc7"). InnerVolumeSpecName "kube-api-access-cdl8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.412093 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a021ef81-1d0c-4fd0-838e-89c7bb921fc7" (UID: "a021ef81-1d0c-4fd0-838e-89c7bb921fc7"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.412476 4929 scope.go:117] "RemoveContainer" containerID="ca24cf8a7853eade963e2e22da3870f5f24f9a7902eba5bae40e351d1f75c792" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.485145 4929 scope.go:117] "RemoveContainer" containerID="685ed4ef40b307c0934dc4f216e37e5f10c5b69ee8e05c836c88cb0bd7d5bf48" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.499707 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdl8m\" (UniqueName: \"kubernetes.io/projected/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-kube-api-access-cdl8m\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.499748 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.499759 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a021ef81-1d0c-4fd0-838e-89c7bb921fc7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.718660 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhmwh"] Oct 02 13:26:35 crc kubenswrapper[4929]: I1002 13:26:35.731348 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhmwh"] Oct 02 13:26:36 crc kubenswrapper[4929]: I1002 13:26:36.168434 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" path="/var/lib/kubelet/pods/a021ef81-1d0c-4fd0-838e-89c7bb921fc7/volumes" Oct 02 13:26:36 crc kubenswrapper[4929]: I1002 13:26:36.400089 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nzgr" event={"ID":"16ec78ed-20d1-42ee-8538-fea19958d2dd","Type":"ContainerStarted","Data":"2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b"} Oct 02 13:26:36 crc kubenswrapper[4929]: I1002 13:26:36.404634 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw68g" event={"ID":"22a8977c-c72d-45f3-8847-1790579022a0","Type":"ContainerStarted","Data":"a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6"} Oct 02 13:26:36 crc kubenswrapper[4929]: I1002 13:26:36.428389 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2nzgr" podStartSLOduration=2.946749597 podStartE2EDuration="7.428369547s" podCreationTimestamp="2025-10-02 13:26:29 +0000 UTC" firstStartedPulling="2025-10-02 13:26:31.31972575 +0000 UTC m=+8191.870092114" lastFinishedPulling="2025-10-02 13:26:35.8013457 +0000 UTC m=+8196.351712064" observedRunningTime="2025-10-02 13:26:36.418132262 +0000 UTC m=+8196.968498636" watchObservedRunningTime="2025-10-02 13:26:36.428369547 +0000 UTC m=+8196.978735911" Oct 02 13:26:36 crc kubenswrapper[4929]: I1002 13:26:36.445089 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vw68g" podStartSLOduration=2.922352942 podStartE2EDuration="7.445069179s" podCreationTimestamp="2025-10-02 13:26:29 +0000 UTC" firstStartedPulling="2025-10-02 13:26:31.321348056 +0000 UTC m=+8191.871714420" 
lastFinishedPulling="2025-10-02 13:26:35.844064293 +0000 UTC m=+8196.394430657" observedRunningTime="2025-10-02 13:26:36.438109388 +0000 UTC m=+8196.988475762" watchObservedRunningTime="2025-10-02 13:26:36.445069179 +0000 UTC m=+8196.995435543" Oct 02 13:26:39 crc kubenswrapper[4929]: I1002 13:26:39.556816 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:39 crc kubenswrapper[4929]: I1002 13:26:39.557140 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:39 crc kubenswrapper[4929]: I1002 13:26:39.606656 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:40 crc kubenswrapper[4929]: I1002 13:26:40.189927 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:40 crc kubenswrapper[4929]: I1002 13:26:40.190086 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:40 crc kubenswrapper[4929]: I1002 13:26:40.241456 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:48 crc kubenswrapper[4929]: I1002 13:26:48.157261 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:26:48 crc kubenswrapper[4929]: E1002 13:26:48.158175 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:26:49 crc kubenswrapper[4929]: I1002 13:26:49.611739 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:49 crc kubenswrapper[4929]: I1002 13:26:49.708059 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vw68g"] Oct 02 13:26:50 crc kubenswrapper[4929]: I1002 13:26:50.238359 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:50 crc kubenswrapper[4929]: I1002 13:26:50.551210 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vw68g" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="registry-server" containerID="cri-o://a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6" gracePeriod=2 Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.158314 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.268338 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-catalog-content\") pod \"22a8977c-c72d-45f3-8847-1790579022a0\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.268422 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmg7x\" (UniqueName: \"kubernetes.io/projected/22a8977c-c72d-45f3-8847-1790579022a0-kube-api-access-tmg7x\") pod \"22a8977c-c72d-45f3-8847-1790579022a0\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.268447 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-utilities\") pod \"22a8977c-c72d-45f3-8847-1790579022a0\" (UID: \"22a8977c-c72d-45f3-8847-1790579022a0\") " Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.272076 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-utilities" (OuterVolumeSpecName: "utilities") pod "22a8977c-c72d-45f3-8847-1790579022a0" (UID: "22a8977c-c72d-45f3-8847-1790579022a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.274906 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a8977c-c72d-45f3-8847-1790579022a0-kube-api-access-tmg7x" (OuterVolumeSpecName: "kube-api-access-tmg7x") pod "22a8977c-c72d-45f3-8847-1790579022a0" (UID: "22a8977c-c72d-45f3-8847-1790579022a0"). InnerVolumeSpecName "kube-api-access-tmg7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.330305 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22a8977c-c72d-45f3-8847-1790579022a0" (UID: "22a8977c-c72d-45f3-8847-1790579022a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.370511 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.370548 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmg7x\" (UniqueName: \"kubernetes.io/projected/22a8977c-c72d-45f3-8847-1790579022a0-kube-api-access-tmg7x\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.370559 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22a8977c-c72d-45f3-8847-1790579022a0-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.566348 4929 generic.go:334] "Generic (PLEG): container finished" podID="22a8977c-c72d-45f3-8847-1790579022a0" containerID="a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6" exitCode=0 Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.566443 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vw68g" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.566471 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw68g" event={"ID":"22a8977c-c72d-45f3-8847-1790579022a0","Type":"ContainerDied","Data":"a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6"} Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.567016 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw68g" event={"ID":"22a8977c-c72d-45f3-8847-1790579022a0","Type":"ContainerDied","Data":"8c4cb06839c50f45bcaca72fc61cf2c22a538135f7d195cf684623b087bbb245"} Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.567036 4929 scope.go:117] "RemoveContainer" containerID="a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.616577 4929 scope.go:117] "RemoveContainer" containerID="edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.620438 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vw68g"] Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.632457 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vw68g"] Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.658028 4929 scope.go:117] "RemoveContainer" containerID="d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.702001 4929 scope.go:117] "RemoveContainer" containerID="a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6" Oct 02 13:26:51 crc kubenswrapper[4929]: E1002 13:26:51.702770 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6\": container with ID starting with a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6 not found: ID does not exist" containerID="a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.702827 
4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6"} err="failed to get container status \"a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6\": rpc error: code = NotFound desc = could not find container \"a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6\": container with ID starting with a06d9b0a93e06dd0b5cb7900d2eece6e1a54641125af92947a5a74a97e4606d6 not found: ID does not exist" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.702858 4929 scope.go:117] "RemoveContainer" containerID="edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45" Oct 02 13:26:51 crc kubenswrapper[4929]: E1002 13:26:51.703195 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45\": container with ID starting with edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45 not found: ID does not exist" containerID="edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.703232 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45"} err="failed to get container status \"edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45\": rpc error: code = NotFound desc = could not find container \"edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45\": container with ID starting with edd0ea361764f7d49b7e7cf23666aed6b41a8d51f2b88a853827ccf9d3ff4f45 not found: ID does not exist" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.703251 4929 scope.go:117] "RemoveContainer" containerID="d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1" Oct 02 13:26:51 crc kubenswrapper[4929]: E1002 13:26:51.703527 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1\": container with ID starting with d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1 not found: ID does not exist" containerID="d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1" Oct 02 13:26:51 crc kubenswrapper[4929]: I1002 13:26:51.703565 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1"} err="failed to get container status \"d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1\": rpc error: code = NotFound desc = could not find container \"d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1\": container with ID starting with d6366840af0b6f541bc72252544701ab81517091463c3b3f0b3011ce9b236ab1 not found: ID does not exist" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.001606 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nzgr"] Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.001981 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2nzgr" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="registry-server" containerID="cri-o://2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b" gracePeriod=2 Oct 02 
13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.174161 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a8977c-c72d-45f3-8847-1790579022a0" path="/var/lib/kubelet/pods/22a8977c-c72d-45f3-8847-1790579022a0/volumes" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.540644 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.595634 4929 generic.go:334] "Generic (PLEG): container finished" podID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerID="2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b" exitCode=0 Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.595727 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nzgr" event={"ID":"16ec78ed-20d1-42ee-8538-fea19958d2dd","Type":"ContainerDied","Data":"2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b"} Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.595799 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2nzgr" event={"ID":"16ec78ed-20d1-42ee-8538-fea19958d2dd","Type":"ContainerDied","Data":"b71593facafe63e8f7484c89aa9f0458e33437514354eb81f93493c71c62e15f"} Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.595832 4929 scope.go:117] "RemoveContainer" containerID="2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.596173 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2nzgr" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.599141 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkd6m\" (UniqueName: \"kubernetes.io/projected/16ec78ed-20d1-42ee-8538-fea19958d2dd-kube-api-access-zkd6m\") pod \"16ec78ed-20d1-42ee-8538-fea19958d2dd\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.599201 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-catalog-content\") pod \"16ec78ed-20d1-42ee-8538-fea19958d2dd\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.599721 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-utilities\") pod \"16ec78ed-20d1-42ee-8538-fea19958d2dd\" (UID: \"16ec78ed-20d1-42ee-8538-fea19958d2dd\") " Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.601254 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-utilities" (OuterVolumeSpecName: "utilities") pod "16ec78ed-20d1-42ee-8538-fea19958d2dd" (UID: "16ec78ed-20d1-42ee-8538-fea19958d2dd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.605917 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.613841 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ec78ed-20d1-42ee-8538-fea19958d2dd-kube-api-access-zkd6m" (OuterVolumeSpecName: "kube-api-access-zkd6m") pod "16ec78ed-20d1-42ee-8538-fea19958d2dd" (UID: "16ec78ed-20d1-42ee-8538-fea19958d2dd"). InnerVolumeSpecName "kube-api-access-zkd6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.630302 4929 scope.go:117] "RemoveContainer" containerID="aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.660605 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16ec78ed-20d1-42ee-8538-fea19958d2dd" (UID: "16ec78ed-20d1-42ee-8538-fea19958d2dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.697693 4929 scope.go:117] "RemoveContainer" containerID="465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.708934 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkd6m\" (UniqueName: \"kubernetes.io/projected/16ec78ed-20d1-42ee-8538-fea19958d2dd-kube-api-access-zkd6m\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.709002 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16ec78ed-20d1-42ee-8538-fea19958d2dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.759658 4929 scope.go:117] "RemoveContainer" containerID="2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b" Oct 02 13:26:52 crc kubenswrapper[4929]: E1002 13:26:52.760668 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b\": container with ID starting with 2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b not found: ID does not exist" containerID="2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.760721 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b"} err="failed to get container status \"2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b\": rpc error: code = NotFound desc = could not find container \"2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b\": container with ID starting with 2b6a5a4650d4b500fdf79dbb53403f25affaf21b3e9936af5f4c85cef269034b not found: ID does not exist" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.760768 4929 scope.go:117] "RemoveContainer" containerID="aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f" Oct 02 
13:26:52 crc kubenswrapper[4929]: E1002 13:26:52.761167 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f\": container with ID starting with aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f not found: ID does not exist" containerID="aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.761194 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f"} err="failed to get container status \"aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f\": rpc error: code = NotFound desc = could not find container \"aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f\": container with ID starting with aab08ea7817164a9fb05e1fea8a9449050691f82706451b8eda4e4adcecba25f not found: ID does not exist" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.761207 4929 scope.go:117] "RemoveContainer" containerID="465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea" Oct 02 13:26:52 crc kubenswrapper[4929]: E1002 13:26:52.761588 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea\": container with ID starting with 465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea not found: ID does not exist" containerID="465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.761668 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea"} err="failed to get container status \"465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea\": rpc error: code = NotFound desc = could not find container \"465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea\": container with ID starting with 465b1c6d9cc362e7a51aa96dc9edbdc483f9f9da2d0c125352a85a17a48d74ea not found: ID does not exist" Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.937864 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2nzgr"] Oct 02 13:26:52 crc kubenswrapper[4929]: I1002 13:26:52.946251 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2nzgr"] Oct 02 13:26:54 crc kubenswrapper[4929]: I1002 13:26:54.175120 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" path="/var/lib/kubelet/pods/16ec78ed-20d1-42ee-8538-fea19958d2dd/volumes" Oct 02 13:27:01 crc kubenswrapper[4929]: I1002 13:27:01.157246 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:27:01 crc kubenswrapper[4929]: E1002 13:27:01.157976 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:27:14 crc kubenswrapper[4929]: I1002 13:27:14.157206 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:27:14 crc kubenswrapper[4929]: E1002 13:27:14.158025 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:27:26 crc kubenswrapper[4929]: I1002 13:27:26.156613 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:27:26 crc kubenswrapper[4929]: E1002 13:27:26.157914 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:27:38 crc kubenswrapper[4929]: I1002 13:27:38.157430 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:27:38 crc kubenswrapper[4929]: E1002 13:27:38.158330 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:27:52 crc kubenswrapper[4929]: I1002 13:27:52.157439 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:27:52 crc kubenswrapper[4929]: E1002 13:27:52.158411 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:28:04 crc kubenswrapper[4929]: I1002 13:28:04.156851 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:28:04 crc kubenswrapper[4929]: E1002 13:28:04.157765 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:28:16 crc kubenswrapper[4929]: I1002 13:28:16.156794 4929 scope.go:117] "RemoveContainer" 
containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:28:16 crc kubenswrapper[4929]: E1002 13:28:16.157689 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:28:31 crc kubenswrapper[4929]: I1002 13:28:31.157007 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:28:31 crc kubenswrapper[4929]: E1002 13:28:31.158060 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:28:43 crc kubenswrapper[4929]: I1002 13:28:43.156890 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:28:43 crc kubenswrapper[4929]: E1002 13:28:43.157782 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:28:58 crc kubenswrapper[4929]: I1002 13:28:58.158404 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:28:58 crc kubenswrapper[4929]: E1002 13:28:58.160101 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:29:09 crc kubenswrapper[4929]: I1002 13:29:09.157295 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:29:09 crc kubenswrapper[4929]: E1002 13:29:09.158089 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:29:20 crc kubenswrapper[4929]: I1002 13:29:20.163918 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:29:20 crc kubenswrapper[4929]: E1002 13:29:20.164785 4929 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:29:34 crc kubenswrapper[4929]: I1002 13:29:34.157765 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:29:34 crc kubenswrapper[4929]: E1002 13:29:34.158722 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:29:48 crc kubenswrapper[4929]: I1002 13:29:48.156443 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:29:48 crc kubenswrapper[4929]: E1002 13:29:48.157208 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.173532 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz"] Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174561 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="extract-utilities" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174583 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="extract-utilities" Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174598 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="extract-content" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174606 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="extract-content" Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174627 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="extract-content" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174633 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="extract-content" Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174650 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="extract-content" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174656 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="extract-content" Oct 02 13:30:00 crc kubenswrapper[4929]: 
E1002 13:30:00.174671 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174679 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174698 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="extract-utilities" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174704 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="extract-utilities" Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174715 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174720 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174736 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174741 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: E1002 13:30:00.174758 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="extract-utilities" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.174764 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="extract-utilities" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.175009 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a021ef81-1d0c-4fd0-838e-89c7bb921fc7" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.175032 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a8977c-c72d-45f3-8847-1790579022a0" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.175047 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ec78ed-20d1-42ee-8538-fea19958d2dd" containerName="registry-server" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.177432 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.179546 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.180732 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.181227 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz"] Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.265484 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfdcc44b-74c6-4671-aed9-496a0adc23c3-secret-volume\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.266125 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfdcc44b-74c6-4671-aed9-496a0adc23c3-config-volume\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.266263 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4j4m\" (UniqueName: \"kubernetes.io/projected/bfdcc44b-74c6-4671-aed9-496a0adc23c3-kube-api-access-b4j4m\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.368225 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfdcc44b-74c6-4671-aed9-496a0adc23c3-config-volume\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.368301 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4j4m\" (UniqueName: \"kubernetes.io/projected/bfdcc44b-74c6-4671-aed9-496a0adc23c3-kube-api-access-b4j4m\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.368383 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfdcc44b-74c6-4671-aed9-496a0adc23c3-secret-volume\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.369744 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfdcc44b-74c6-4671-aed9-496a0adc23c3-config-volume\") pod 
\"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.374143 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfdcc44b-74c6-4671-aed9-496a0adc23c3-secret-volume\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.383263 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4j4m\" (UniqueName: \"kubernetes.io/projected/bfdcc44b-74c6-4671-aed9-496a0adc23c3-kube-api-access-b4j4m\") pod \"collect-profiles-29323530-rjrhz\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.505435 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:00 crc kubenswrapper[4929]: I1002 13:30:00.994429 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz"] Oct 02 13:30:01 crc kubenswrapper[4929]: I1002 13:30:01.611259 4929 generic.go:334] "Generic (PLEG): container finished" podID="0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" containerID="f852c373621d1d0947fceb690e9bc72231b8615518911c4dbfcae8294e95d0a0" exitCode=0 Oct 02 13:30:01 crc kubenswrapper[4929]: I1002 13:30:01.611333 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" event={"ID":"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a","Type":"ContainerDied","Data":"f852c373621d1d0947fceb690e9bc72231b8615518911c4dbfcae8294e95d0a0"} Oct 02 13:30:01 crc kubenswrapper[4929]: I1002 13:30:01.614736 4929 generic.go:334] "Generic (PLEG): container finished" podID="bfdcc44b-74c6-4671-aed9-496a0adc23c3" containerID="0040bd03f8a17c3ab16974a2ec358286022dcafde614a28bf8e79dbdc8fc3085" exitCode=0 Oct 02 13:30:01 crc kubenswrapper[4929]: I1002 13:30:01.614771 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" event={"ID":"bfdcc44b-74c6-4671-aed9-496a0adc23c3","Type":"ContainerDied","Data":"0040bd03f8a17c3ab16974a2ec358286022dcafde614a28bf8e79dbdc8fc3085"} Oct 02 13:30:01 crc kubenswrapper[4929]: I1002 13:30:01.614796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" event={"ID":"bfdcc44b-74c6-4671-aed9-496a0adc23c3","Type":"ContainerStarted","Data":"f4fccd1391bf95435a14032ee639ebcc8fe491f5f563770991489d9686e79469"} Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.127651 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.133516 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.160147 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01" Oct 02 13:30:03 crc kubenswrapper[4929]: E1002 13:30:03.170686 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229741 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ceph\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229811 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-0\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229852 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-1\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229872 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfdcc44b-74c6-4671-aed9-496a0adc23c3-secret-volume\") pod \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229890 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-0\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229926 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-combined-ca-bundle\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229944 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-inventory\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.229975 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4j4m\" (UniqueName: 
\"kubernetes.io/projected/bfdcc44b-74c6-4671-aed9-496a0adc23c3-kube-api-access-b4j4m\") pod \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.230016 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-0\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.230042 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ssh-key\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.230077 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfdcc44b-74c6-4671-aed9-496a0adc23c3-config-volume\") pod \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\" (UID: \"bfdcc44b-74c6-4671-aed9-496a0adc23c3\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.230095 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-1\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.230114 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-1\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.230141 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-kube-api-access-vhhfl\") pod \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\" (UID: \"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a\") " Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.232329 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfdcc44b-74c6-4671-aed9-496a0adc23c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "bfdcc44b-74c6-4671-aed9-496a0adc23c3" (UID: "bfdcc44b-74c6-4671-aed9-496a0adc23c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.236159 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ceph" (OuterVolumeSpecName: "ceph") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.238105 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfdcc44b-74c6-4671-aed9-496a0adc23c3-kube-api-access-b4j4m" (OuterVolumeSpecName: "kube-api-access-b4j4m") pod "bfdcc44b-74c6-4671-aed9-496a0adc23c3" (UID: "bfdcc44b-74c6-4671-aed9-496a0adc23c3"). InnerVolumeSpecName "kube-api-access-b4j4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.238313 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdcc44b-74c6-4671-aed9-496a0adc23c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bfdcc44b-74c6-4671-aed9-496a0adc23c3" (UID: "bfdcc44b-74c6-4671-aed9-496a0adc23c3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.239273 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-kube-api-access-vhhfl" (OuterVolumeSpecName: "kube-api-access-vhhfl") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "kube-api-access-vhhfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.240881 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.259174 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.261261 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.266909 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.267318 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.272496 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.273029 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.274190 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.276091 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-inventory" (OuterVolumeSpecName: "inventory") pod "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" (UID: "0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332061 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332113 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332129 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfdcc44b-74c6-4671-aed9-496a0adc23c3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332144 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332158 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332170 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-kube-api-access-vhhfl\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332183 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332197 4929 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332214 4929 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332230 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfdcc44b-74c6-4671-aed9-496a0adc23c3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332248 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332263 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332284 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.332301 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4j4m\" (UniqueName: \"kubernetes.io/projected/bfdcc44b-74c6-4671-aed9-496a0adc23c3-kube-api-access-b4j4m\") on node \"crc\" DevicePath \"\"" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.632830 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.632827 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323530-rjrhz" event={"ID":"bfdcc44b-74c6-4671-aed9-496a0adc23c3","Type":"ContainerDied","Data":"f4fccd1391bf95435a14032ee639ebcc8fe491f5f563770991489d9686e79469"} Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.632984 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4fccd1391bf95435a14032ee639ebcc8fe491f5f563770991489d9686e79469" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.634668 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" event={"ID":"0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a","Type":"ContainerDied","Data":"278a6ee4e7e721fff58201aca5b84d726a170b227f1a9905e64959cc152dbee5"} Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.634705 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278a6ee4e7e721fff58201aca5b84d726a170b227f1a9905e64959cc152dbee5" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.634763 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-8z5c4" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.728832 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-m9k6m"] Oct 02 13:30:03 crc kubenswrapper[4929]: E1002 13:30:03.729514 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.729558 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:30:03 crc kubenswrapper[4929]: E1002 13:30:03.729627 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdcc44b-74c6-4671-aed9-496a0adc23c3" containerName="collect-profiles" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.729644 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdcc44b-74c6-4671-aed9-496a0adc23c3" containerName="collect-profiles" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.729985 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.730024 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfdcc44b-74c6-4671-aed9-496a0adc23c3" containerName="collect-profiles" Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.731082 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.734364 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.734379 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.734592 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.734645 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.734794 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740240 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740323 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740355 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-inventory\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740428 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtdct\" (UniqueName: \"kubernetes.io/projected/d19e27bb-e8ff-410d-a0cb-be48e389a20c-kube-api-access-xtdct\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740501 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ssh-key\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740542 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740567 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceph\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740607 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.740743 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-m9k6m"]
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.842738 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.842878 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.842917 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.842935 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-inventory\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.843010 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtdct\" (UniqueName: \"kubernetes.io/projected/d19e27bb-e8ff-410d-a0cb-be48e389a20c-kube-api-access-xtdct\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.843069 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ssh-key\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.843096 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.843115 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceph\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.846885 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.846994 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.846884 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.848416 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.848525 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-inventory\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.848604 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceph\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.849261 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ssh-key\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:03 crc kubenswrapper[4929]: I1002 13:30:03.861629 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtdct\" (UniqueName: \"kubernetes.io/projected/d19e27bb-e8ff-410d-a0cb-be48e389a20c-kube-api-access-xtdct\") pod \"telemetry-openstack-openstack-cell1-m9k6m\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:04 crc kubenswrapper[4929]: I1002 13:30:04.053841 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:30:04 crc kubenswrapper[4929]: I1002 13:30:04.223610 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm"]
Oct 02 13:30:04 crc kubenswrapper[4929]: I1002 13:30:04.233015 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-xtjkm"]
Oct 02 13:30:04 crc kubenswrapper[4929]: I1002 13:30:04.609893 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-m9k6m"]
Oct 02 13:30:04 crc kubenswrapper[4929]: I1002 13:30:04.618926 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 13:30:04 crc kubenswrapper[4929]: I1002 13:30:04.645153 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m" event={"ID":"d19e27bb-e8ff-410d-a0cb-be48e389a20c","Type":"ContainerStarted","Data":"18b89e42b723038dd566cfb2c307ca574602716b1f2fecc2249a9f4a437848bb"}
Oct 02 13:30:06 crc kubenswrapper[4929]: I1002 13:30:06.172169 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c46c1f-700a-45a7-ab95-42782085e0ed" path="/var/lib/kubelet/pods/76c46c1f-700a-45a7-ab95-42782085e0ed/volumes"
Oct 02 13:30:06 crc kubenswrapper[4929]: I1002 13:30:06.668102 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m" event={"ID":"d19e27bb-e8ff-410d-a0cb-be48e389a20c","Type":"ContainerStarted","Data":"d742855011906bd0561dd22e62687148a2a6a475e6db5efae03fb36fa321a315"}
Oct 02 13:30:06 crc kubenswrapper[4929]: I1002 13:30:06.693682 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m" podStartSLOduration=2.634768922 podStartE2EDuration="3.693662808s" podCreationTimestamp="2025-10-02 13:30:03 +0000 UTC" firstStartedPulling="2025-10-02 13:30:04.61870554 +0000 UTC m=+8405.169071904" lastFinishedPulling="2025-10-02 13:30:05.677599426 +0000 UTC m=+8406.227965790" observedRunningTime="2025-10-02 13:30:06.68577784 +0000 UTC m=+8407.236144204" watchObservedRunningTime="2025-10-02 13:30:06.693662808 +0000 UTC m=+8407.244029172"
Oct 02 13:30:14 crc kubenswrapper[4929]: I1002 13:30:14.158337 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:30:14 crc kubenswrapper[4929]: E1002 13:30:14.160451 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:30:28 crc kubenswrapper[4929]: I1002 13:30:28.157173 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:30:28 crc kubenswrapper[4929]: E1002 13:30:28.157868 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:30:28 crc kubenswrapper[4929]: I1002 13:30:28.832606 4929 scope.go:117] "RemoveContainer" containerID="9cec864a1d9a9f1dd02769dd89c1de8cca7c45b7a4d6ce3eebecbb83b39fb157"
Oct 02 13:30:42 crc kubenswrapper[4929]: I1002 13:30:42.157033 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:30:42 crc kubenswrapper[4929]: E1002 13:30:42.157852 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:30:56 crc kubenswrapper[4929]: I1002 13:30:56.157203 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:30:57 crc kubenswrapper[4929]: I1002 13:30:57.195826 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"46462258430a3a49d52c657e107f70a0d80ed5020c6c7156d19d7bc9db5db7c4"}
Oct 02 13:33:14 crc kubenswrapper[4929]: I1002 13:33:14.736653 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:33:14 crc kubenswrapper[4929]: I1002 13:33:14.737285 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
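The two prober entries directly above show the kubelet's HTTP liveness probe for machine-config-daemon failing because nothing is listening on 127.0.0.1:8798 yet. Functionally, such a probe is just a GET with a short timeout in which connection refused counts as failure; a minimal stand-in (an illustrative script using the URL from the log output, not kubelet code):

import urllib.request
import urllib.error

def probe(url="http://127.0.0.1:8798/health", timeout=1.0):
    # Connection refused surfaces as URLError/OSError and counts as a failed probe.
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

print("success" if probe() else "failure")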
Oct 02 13:33:44 crc kubenswrapper[4929]: I1002 13:33:44.737238 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:33:44 crc kubenswrapper[4929]: I1002 13:33:44.737719 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:34:11 crc kubenswrapper[4929]: I1002 13:34:11.128798 4929 generic.go:334] "Generic (PLEG): container finished" podID="d19e27bb-e8ff-410d-a0cb-be48e389a20c" containerID="d742855011906bd0561dd22e62687148a2a6a475e6db5efae03fb36fa321a315" exitCode=0
Oct 02 13:34:11 crc kubenswrapper[4929]: I1002 13:34:11.129214 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m" event={"ID":"d19e27bb-e8ff-410d-a0cb-be48e389a20c","Type":"ContainerDied","Data":"d742855011906bd0561dd22e62687148a2a6a475e6db5efae03fb36fa321a315"}
Oct 02 13:34:12 crc kubenswrapper[4929]: I1002 13:34:12.908126 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.057985 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-telemetry-combined-ca-bundle\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") "
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.058118 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-1\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") "
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.058148 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtdct\" (UniqueName: \"kubernetes.io/projected/d19e27bb-e8ff-410d-a0cb-be48e389a20c-kube-api-access-xtdct\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") "
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.058176 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceph\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") "
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.058209 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-inventory\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") "
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.058245 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ssh-key\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") "
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-2\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.060037 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-0\") pod \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\" (UID: \"d19e27bb-e8ff-410d-a0cb-be48e389a20c\") " Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.066129 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.080688 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceph" (OuterVolumeSpecName: "ceph") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.081117 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19e27bb-e8ff-410d-a0cb-be48e389a20c-kube-api-access-xtdct" (OuterVolumeSpecName: "kube-api-access-xtdct") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "kube-api-access-xtdct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.091500 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.092556 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.094002 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.094803 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.100937 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-inventory" (OuterVolumeSpecName: "inventory") pod "d19e27bb-e8ff-410d-a0cb-be48e389a20c" (UID: "d19e27bb-e8ff-410d-a0cb-be48e389a20c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.152796 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m" event={"ID":"d19e27bb-e8ff-410d-a0cb-be48e389a20c","Type":"ContainerDied","Data":"18b89e42b723038dd566cfb2c307ca574602716b1f2fecc2249a9f4a437848bb"} Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.153196 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b89e42b723038dd566cfb2c307ca574602716b1f2fecc2249a9f4a437848bb" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.153092 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-m9k6m" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.161954 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.162004 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.162021 4929 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.162033 4929 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.162045 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtdct\" (UniqueName: \"kubernetes.io/projected/d19e27bb-e8ff-410d-a0cb-be48e389a20c-kube-api-access-xtdct\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.162058 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.162068 4929 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.162091 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e27bb-e8ff-410d-a0cb-be48e389a20c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.302418 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"] Oct 02 13:34:13 crc kubenswrapper[4929]: E1002 13:34:13.302995 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19e27bb-e8ff-410d-a0cb-be48e389a20c" containerName="telemetry-openstack-openstack-cell1" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.303019 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19e27bb-e8ff-410d-a0cb-be48e389a20c" containerName="telemetry-openstack-openstack-cell1" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.303300 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19e27bb-e8ff-410d-a0cb-be48e389a20c" containerName="telemetry-openstack-openstack-cell1" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.304296 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.310422 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.310640 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.310865 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.311026 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.311353 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.317117 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"] Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.470658 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m957g\" (UniqueName: \"kubernetes.io/projected/e7bef8fe-3f7d-4798-b217-76996aab4a9f-kube-api-access-m957g\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.470801 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.471439 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.471439 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.471774 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.472337 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.472848 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.575020 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.575077 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m957g\" (UniqueName: \"kubernetes.io/projected/e7bef8fe-3f7d-4798-b217-76996aab4a9f-kube-api-access-m957g\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.575120 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.575145 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.575205 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.575286 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.579532 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.579934 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.580249 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.580742 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.583172 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.594491 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m957g\" (UniqueName: \"kubernetes.io/projected/e7bef8fe-3f7d-4798-b217-76996aab4a9f-kube-api-access-m957g\") pod \"neutron-sriov-openstack-openstack-cell1-gq6bf\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:13 crc kubenswrapper[4929]: I1002 13:34:13.644915 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"
Oct 02 13:34:14 crc kubenswrapper[4929]: I1002 13:34:14.179922 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gq6bf"]
Oct 02 13:34:14 crc kubenswrapper[4929]: I1002 13:34:14.737071 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:34:14 crc kubenswrapper[4929]: I1002 13:34:14.737462 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:34:14 crc kubenswrapper[4929]: I1002 13:34:14.737523 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 13:34:15 crc kubenswrapper[4929]: I1002 13:34:15.177135 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" event={"ID":"e7bef8fe-3f7d-4798-b217-76996aab4a9f","Type":"ContainerStarted","Data":"baf0bdd48b3ffb2cd62cc61c26b47ce038a2150571346192dc2b88a8f1a4218a"}
Oct 02 13:34:15 crc kubenswrapper[4929]: I1002 13:34:15.177493 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" event={"ID":"e7bef8fe-3f7d-4798-b217-76996aab4a9f","Type":"ContainerStarted","Data":"6735721daa32c47081db22d22d4ebf75e191807597a7fe73f4b61eae06aaf58e"}
Oct 02 13:34:15 crc kubenswrapper[4929]: I1002 13:34:15.177841 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46462258430a3a49d52c657e107f70a0d80ed5020c6c7156d19d7bc9db5db7c4"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 13:34:15 crc kubenswrapper[4929]: I1002 13:34:15.177927 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://46462258430a3a49d52c657e107f70a0d80ed5020c6c7156d19d7bc9db5db7c4" gracePeriod=600
Oct 02 13:34:15 crc kubenswrapper[4929]: I1002 13:34:15.206947 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" podStartSLOduration=1.788353999 podStartE2EDuration="2.206924632s" podCreationTimestamp="2025-10-02 13:34:13 +0000 UTC" firstStartedPulling="2025-10-02 13:34:14.187018557 +0000 UTC m=+8654.737384921" lastFinishedPulling="2025-10-02 13:34:14.60558919 +0000 UTC m=+8655.155955554" observedRunningTime="2025-10-02 13:34:15.196546481 +0000 UTC m=+8655.746912845" watchObservedRunningTime="2025-10-02 13:34:15.206924632 +0000 UTC m=+8655.757291006"
Oct 02 13:34:16 crc kubenswrapper[4929]: I1002 13:34:16.187361 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="46462258430a3a49d52c657e107f70a0d80ed5020c6c7156d19d7bc9db5db7c4" exitCode=0
Oct 02 13:34:16 crc kubenswrapper[4929]: I1002 13:34:16.187420 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"46462258430a3a49d52c657e107f70a0d80ed5020c6c7156d19d7bc9db5db7c4"}
Oct 02 13:34:16 crc kubenswrapper[4929]: I1002 13:34:16.188004 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"}
Oct 02 13:34:16 crc kubenswrapper[4929]: I1002 13:34:16.188025 4929 scope.go:117] "RemoveContainer" containerID="b0415709a2de1c67d10e6d9883538f0bb94ded51095f6eece71a60b4f469fa01"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.115033 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-frt6x"]
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.169546 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.187281 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frt6x"]
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.255561 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-catalog-content\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.255715 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkwxb\" (UniqueName: \"kubernetes.io/projected/3aadaf85-3be1-475f-aefd-ffa93549afc7-kube-api-access-wkwxb\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.256001 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-utilities\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.357783 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkwxb\" (UniqueName: \"kubernetes.io/projected/3aadaf85-3be1-475f-aefd-ffa93549afc7-kube-api-access-wkwxb\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.358080 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-utilities\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.358239 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-catalog-content\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.358772 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-utilities\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.358804 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-catalog-content\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.405364 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkwxb\" (UniqueName: \"kubernetes.io/projected/3aadaf85-3be1-475f-aefd-ffa93549afc7-kube-api-access-wkwxb\") pod \"redhat-operators-frt6x\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") " pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:39 crc kubenswrapper[4929]: I1002 13:34:39.503187 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:40 crc kubenswrapper[4929]: I1002 13:34:40.069181 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frt6x"]
Oct 02 13:34:40 crc kubenswrapper[4929]: I1002 13:34:40.433101 4929 generic.go:334] "Generic (PLEG): container finished" podID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerID="61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07" exitCode=0
Oct 02 13:34:40 crc kubenswrapper[4929]: I1002 13:34:40.433153 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frt6x" event={"ID":"3aadaf85-3be1-475f-aefd-ffa93549afc7","Type":"ContainerDied","Data":"61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07"}
Oct 02 13:34:40 crc kubenswrapper[4929]: I1002 13:34:40.433182 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frt6x" event={"ID":"3aadaf85-3be1-475f-aefd-ffa93549afc7","Type":"ContainerStarted","Data":"00a2275bf21fd760b914fe86f4edcfa75423a1b76e0b6d581cf241b711b15d08"}
Oct 02 13:34:42 crc kubenswrapper[4929]: I1002 13:34:42.454636 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frt6x" event={"ID":"3aadaf85-3be1-475f-aefd-ffa93549afc7","Type":"ContainerStarted","Data":"5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367"}
Oct 02 13:34:46 crc kubenswrapper[4929]: I1002 13:34:46.511172 4929 generic.go:334] "Generic (PLEG): container finished" podID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerID="5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367" exitCode=0
Oct 02 13:34:46 crc kubenswrapper[4929]: I1002 13:34:46.511232 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frt6x" event={"ID":"3aadaf85-3be1-475f-aefd-ffa93549afc7","Type":"ContainerDied","Data":"5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367"}
Oct 02 13:34:47 crc kubenswrapper[4929]: I1002 13:34:47.549668 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frt6x" event={"ID":"3aadaf85-3be1-475f-aefd-ffa93549afc7","Type":"ContainerStarted","Data":"52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3"}
Oct 02 13:34:47 crc kubenswrapper[4929]: I1002 13:34:47.575118 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-frt6x" podStartSLOduration=2.114225961 podStartE2EDuration="8.575099368s" podCreationTimestamp="2025-10-02 13:34:39 +0000 UTC" firstStartedPulling="2025-10-02 13:34:40.435441684 +0000 UTC m=+8680.985808048" lastFinishedPulling="2025-10-02 13:34:46.896315091 +0000 UTC m=+8687.446681455" observedRunningTime="2025-10-02 13:34:47.56688106 +0000 UTC m=+8688.117247434" watchObservedRunningTime="2025-10-02 13:34:47.575099368 +0000 UTC m=+8688.125465732"
Oct 02 13:34:49 crc kubenswrapper[4929]: I1002 13:34:49.504838 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:49 crc kubenswrapper[4929]: I1002 13:34:49.505388 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:50 crc kubenswrapper[4929]: I1002 13:34:50.552916 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-frt6x" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="registry-server" probeResult="failure" output=<
Oct 02 13:34:50 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s
Oct 02 13:34:50 crc kubenswrapper[4929]: >
Oct 02 13:34:59 crc kubenswrapper[4929]: I1002 13:34:59.551842 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:59 crc kubenswrapper[4929]: I1002 13:34:59.605831 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:34:59 crc kubenswrapper[4929]: I1002 13:34:59.787198 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-frt6x"]
Oct 02 13:35:00 crc kubenswrapper[4929]: I1002 13:35:00.697259 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-frt6x" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="registry-server" containerID="cri-o://52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3" gracePeriod=2
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.194508 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.238110 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-utilities\") pod \"3aadaf85-3be1-475f-aefd-ffa93549afc7\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") "
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.238164 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-catalog-content\") pod \"3aadaf85-3be1-475f-aefd-ffa93549afc7\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") "
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.238345 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkwxb\" (UniqueName: \"kubernetes.io/projected/3aadaf85-3be1-475f-aefd-ffa93549afc7-kube-api-access-wkwxb\") pod \"3aadaf85-3be1-475f-aefd-ffa93549afc7\" (UID: \"3aadaf85-3be1-475f-aefd-ffa93549afc7\") "
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.240889 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-utilities" (OuterVolumeSpecName: "utilities") pod "3aadaf85-3be1-475f-aefd-ffa93549afc7" (UID: "3aadaf85-3be1-475f-aefd-ffa93549afc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.246157 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aadaf85-3be1-475f-aefd-ffa93549afc7-kube-api-access-wkwxb" (OuterVolumeSpecName: "kube-api-access-wkwxb") pod "3aadaf85-3be1-475f-aefd-ffa93549afc7" (UID: "3aadaf85-3be1-475f-aefd-ffa93549afc7"). InnerVolumeSpecName "kube-api-access-wkwxb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.331430 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aadaf85-3be1-475f-aefd-ffa93549afc7" (UID: "3aadaf85-3be1-475f-aefd-ffa93549afc7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.341202 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.341256 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aadaf85-3be1-475f-aefd-ffa93549afc7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.341272 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkwxb\" (UniqueName: \"kubernetes.io/projected/3aadaf85-3be1-475f-aefd-ffa93549afc7-kube-api-access-wkwxb\") on node \"crc\" DevicePath \"\""
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.714278 4929 generic.go:334] "Generic (PLEG): container finished" podID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerID="52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3" exitCode=0
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.714333 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frt6x" event={"ID":"3aadaf85-3be1-475f-aefd-ffa93549afc7","Type":"ContainerDied","Data":"52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3"}
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.714371 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frt6x" event={"ID":"3aadaf85-3be1-475f-aefd-ffa93549afc7","Type":"ContainerDied","Data":"00a2275bf21fd760b914fe86f4edcfa75423a1b76e0b6d581cf241b711b15d08"}
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.714396 4929 scope.go:117] "RemoveContainer" containerID="52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3"
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.714596 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frt6x"
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.746707 4929 scope.go:117] "RemoveContainer" containerID="5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367"
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.747637 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-frt6x"]
Oct 02 13:35:01 crc kubenswrapper[4929]: I1002 13:35:01.758242 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-frt6x"]
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.167229 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" path="/var/lib/kubelet/pods/3aadaf85-3be1-475f-aefd-ffa93549afc7/volumes"
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.227044 4929 scope.go:117] "RemoveContainer" containerID="61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07"
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.256006 4929 scope.go:117] "RemoveContainer" containerID="52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3"
Oct 02 13:35:02 crc kubenswrapper[4929]: E1002 13:35:02.256571 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3\": container with ID starting with 52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3 not found: ID does not exist" containerID="52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3"
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.256633 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3"} err="failed to get container status \"52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3\": rpc error: code = NotFound desc = could not find container \"52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3\": container with ID starting with 52d0d9b3cbc14ed816663ff1a8b9b2595331e5c88416db3593b565e4efd5b4d3 not found: ID does not exist"
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.256666 4929 scope.go:117] "RemoveContainer" containerID="5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367"
Oct 02 13:35:02 crc kubenswrapper[4929]: E1002 13:35:02.257037 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367\": container with ID starting with 5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367 not found: ID does not exist" containerID="5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367"
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.257094 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367"} err="failed to get container status \"5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367\": rpc error: code = NotFound desc = could not find container \"5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367\": container with ID starting with 5bcf03e94f6b1fcf2ba2ead80cf92d776c5aa6655fd03f6cbfaae0d086a36367 not found: ID does not exist"
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.257136 4929 scope.go:117] "RemoveContainer" containerID="61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07"
Oct 02 13:35:02 crc kubenswrapper[4929]: E1002 13:35:02.257454 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07\": container with ID starting with 61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07 not found: ID does not exist" containerID="61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07"
Oct 02 13:35:02 crc kubenswrapper[4929]: I1002 13:35:02.257496 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07"} err="failed to get container status \"61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07\": rpc error: code = NotFound desc = could not find container \"61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07\": container with ID starting with 61ef8da0c3e41d5ca4a797afc2323d527b79877da92f47492a2e2d644c283c07 not found: ID does not exist"
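The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triples above are kubelet re-deleting containers whose runtime records are already gone; each NotFound is logged and then dropped, since a missing container already satisfies the goal of deletion. The idempotent-delete pattern they illustrate, in miniature (a generic sketch with a hypothetical runtime interface, not kubelet code):

class NotFoundError(Exception):
    """Stand-in for the runtime's NotFound status code."""

def remove_container(runtime, container_id):
    try:
        runtime.remove(container_id)
    except NotFoundError:
        # Already gone: the desired end state (container absent) holds,
        # so NotFound on delete is logged at most and not treated as failure.
        pass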
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.678659 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-njkg7"]
Oct 02 13:36:43 crc kubenswrapper[4929]: E1002 13:36:43.679744 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="registry-server"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.679762 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="registry-server"
Oct 02 13:36:43 crc kubenswrapper[4929]: E1002 13:36:43.679805 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="extract-content"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.679811 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="extract-content"
Oct 02 13:36:43 crc kubenswrapper[4929]: E1002 13:36:43.679823 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="extract-utilities"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.679831 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="extract-utilities"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.680373 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aadaf85-3be1-475f-aefd-ffa93549afc7" containerName="registry-server"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.682125 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.692726 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njkg7"]
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.811053 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6vj\" (UniqueName: \"kubernetes.io/projected/3bc57c08-f5de-4a57-b270-d082c7d490c9-kube-api-access-6b6vj\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.811158 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-utilities\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.811313 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-catalog-content\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.912934 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b6vj\" (UniqueName: \"kubernetes.io/projected/3bc57c08-f5de-4a57-b270-d082c7d490c9-kube-api-access-6b6vj\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.913329 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-utilities\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.913393 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-catalog-content\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.914144 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-catalog-content\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.914149 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-utilities\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:43 crc kubenswrapper[4929]: I1002 13:36:43.951364 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b6vj\" (UniqueName: \"kubernetes.io/projected/3bc57c08-f5de-4a57-b270-d082c7d490c9-kube-api-access-6b6vj\") pod \"community-operators-njkg7\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") " pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:44 crc kubenswrapper[4929]: I1002 13:36:44.018242 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:44 crc kubenswrapper[4929]: I1002 13:36:44.594067 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njkg7"]
Oct 02 13:36:44 crc kubenswrapper[4929]: I1002 13:36:44.738357 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:36:44 crc kubenswrapper[4929]: I1002 13:36:44.738689 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 13:36:44 crc kubenswrapper[4929]: I1002 13:36:44.774977 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njkg7" event={"ID":"3bc57c08-f5de-4a57-b270-d082c7d490c9","Type":"ContainerStarted","Data":"f68a78899414c697f1634f420a858ad7c0842889e5132811c1bff6cf537fe6c4"}
Oct 02 13:36:45 crc kubenswrapper[4929]: I1002 13:36:45.787586 4929 generic.go:334] "Generic (PLEG): container finished" podID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerID="316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8" exitCode=0
Oct 02 13:36:45 crc kubenswrapper[4929]: I1002 13:36:45.787688 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njkg7" event={"ID":"3bc57c08-f5de-4a57-b270-d082c7d490c9","Type":"ContainerDied","Data":"316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8"}
Oct 02 13:36:45 crc kubenswrapper[4929]: I1002 13:36:45.790041 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 13:36:47 crc kubenswrapper[4929]: I1002 13:36:47.811265 4929 generic.go:334] "Generic (PLEG): container finished" podID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerID="81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4" exitCode=0
Oct 02 13:36:47 crc kubenswrapper[4929]: I1002 13:36:47.811401 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njkg7" event={"ID":"3bc57c08-f5de-4a57-b270-d082c7d490c9","Type":"ContainerDied","Data":"81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4"}
Oct 02 13:36:48 crc kubenswrapper[4929]: I1002 13:36:48.824659 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njkg7" event={"ID":"3bc57c08-f5de-4a57-b270-d082c7d490c9","Type":"ContainerStarted","Data":"215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989"}
Oct 02 13:36:48 crc kubenswrapper[4929]: I1002 13:36:48.858695 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-njkg7" podStartSLOduration=3.453569179 podStartE2EDuration="5.858672447s" podCreationTimestamp="2025-10-02 13:36:43 +0000 UTC" firstStartedPulling="2025-10-02 13:36:45.789801458 +0000 UTC m=+8806.340167822" lastFinishedPulling="2025-10-02 13:36:48.194904716 +0000 UTC m=+8808.745271090" observedRunningTime="2025-10-02 13:36:48.849867312 +0000 UTC m=+8809.400233676" watchObservedRunningTime="2025-10-02 13:36:48.858672447 +0000 UTC m=+8809.409038811"
Oct 02 13:36:54 crc kubenswrapper[4929]: I1002 13:36:54.018738 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:54 crc kubenswrapper[4929]: I1002 13:36:54.019230 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:54 crc kubenswrapper[4929]: I1002 13:36:54.072988 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:54 crc kubenswrapper[4929]: I1002 13:36:54.946223 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:55 crc kubenswrapper[4929]: I1002 13:36:55.000417 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njkg7"]
Oct 02 13:36:56 crc kubenswrapper[4929]: I1002 13:36:56.912189 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-njkg7" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="registry-server" containerID="cri-o://215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989" gracePeriod=2
Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.497102 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njkg7"
Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.643818 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-utilities\") pod \"3bc57c08-f5de-4a57-b270-d082c7d490c9\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") "
Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.643944 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b6vj\" (UniqueName: \"kubernetes.io/projected/3bc57c08-f5de-4a57-b270-d082c7d490c9-kube-api-access-6b6vj\") pod \"3bc57c08-f5de-4a57-b270-d082c7d490c9\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") "
Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.644077 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-catalog-content\") pod \"3bc57c08-f5de-4a57-b270-d082c7d490c9\" (UID: \"3bc57c08-f5de-4a57-b270-d082c7d490c9\") "
Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.645060 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-utilities" (OuterVolumeSpecName: "utilities") pod "3bc57c08-f5de-4a57-b270-d082c7d490c9" (UID: "3bc57c08-f5de-4a57-b270-d082c7d490c9"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.654054 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc57c08-f5de-4a57-b270-d082c7d490c9-kube-api-access-6b6vj" (OuterVolumeSpecName: "kube-api-access-6b6vj") pod "3bc57c08-f5de-4a57-b270-d082c7d490c9" (UID: "3bc57c08-f5de-4a57-b270-d082c7d490c9"). InnerVolumeSpecName "kube-api-access-6b6vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.695847 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bc57c08-f5de-4a57-b270-d082c7d490c9" (UID: "3bc57c08-f5de-4a57-b270-d082c7d490c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.746539 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.746579 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b6vj\" (UniqueName: \"kubernetes.io/projected/3bc57c08-f5de-4a57-b270-d082c7d490c9-kube-api-access-6b6vj\") on node \"crc\" DevicePath \"\"" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.746591 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc57c08-f5de-4a57-b270-d082c7d490c9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.924359 4929 generic.go:334] "Generic (PLEG): container finished" podID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerID="215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989" exitCode=0 Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.924409 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njkg7" event={"ID":"3bc57c08-f5de-4a57-b270-d082c7d490c9","Type":"ContainerDied","Data":"215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989"} Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.924445 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njkg7" event={"ID":"3bc57c08-f5de-4a57-b270-d082c7d490c9","Type":"ContainerDied","Data":"f68a78899414c697f1634f420a858ad7c0842889e5132811c1bff6cf537fe6c4"} Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.924466 4929 scope.go:117] "RemoveContainer" containerID="215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.924470 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-njkg7" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.961490 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njkg7"] Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.964051 4929 scope.go:117] "RemoveContainer" containerID="81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4" Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.970387 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-njkg7"] Oct 02 13:36:57 crc kubenswrapper[4929]: I1002 13:36:57.985259 4929 scope.go:117] "RemoveContainer" containerID="316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8" Oct 02 13:36:58 crc kubenswrapper[4929]: I1002 13:36:58.033816 4929 scope.go:117] "RemoveContainer" containerID="215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989" Oct 02 13:36:58 crc kubenswrapper[4929]: E1002 13:36:58.034349 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989\": container with ID starting with 215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989 not found: ID does not exist" containerID="215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989" Oct 02 13:36:58 crc kubenswrapper[4929]: I1002 13:36:58.034464 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989"} err="failed to get container status \"215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989\": rpc error: code = NotFound desc = could not find container \"215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989\": container with ID starting with 215ad9ca5add4855fb4c96a774f42e2ab2deb1483e90010ca1058fdafc921989 not found: ID does not exist" Oct 02 13:36:58 crc kubenswrapper[4929]: I1002 13:36:58.034566 4929 scope.go:117] "RemoveContainer" containerID="81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4" Oct 02 13:36:58 crc kubenswrapper[4929]: E1002 13:36:58.034980 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4\": container with ID starting with 81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4 not found: ID does not exist" containerID="81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4" Oct 02 13:36:58 crc kubenswrapper[4929]: I1002 13:36:58.035014 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4"} err="failed to get container status \"81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4\": rpc error: code = NotFound desc = could not find container \"81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4\": container with ID starting with 81f98ede6fe57d0dcb03c4ad422c9f9650d5e3f98d67b61e486f3715c4c7eee4 not found: ID does not exist" Oct 02 13:36:58 crc kubenswrapper[4929]: I1002 13:36:58.035033 4929 scope.go:117] "RemoveContainer" containerID="316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8" Oct 02 13:36:58 crc kubenswrapper[4929]: E1002 13:36:58.035424 4929 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8\": container with ID starting with 316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8 not found: ID does not exist" containerID="316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8" Oct 02 13:36:58 crc kubenswrapper[4929]: I1002 13:36:58.035473 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8"} err="failed to get container status \"316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8\": rpc error: code = NotFound desc = could not find container \"316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8\": container with ID starting with 316eb6444f70a8d41a36fc781dcb7b1ba9a71573306bf7b8262803119e723fc8 not found: ID does not exist" Oct 02 13:36:58 crc kubenswrapper[4929]: I1002 13:36:58.168305 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" path="/var/lib/kubelet/pods/3bc57c08-f5de-4a57-b270-d082c7d490c9/volumes" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.364973 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkng"] Oct 02 13:37:05 crc kubenswrapper[4929]: E1002 13:37:05.365686 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="registry-server" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.365698 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="registry-server" Oct 02 13:37:05 crc kubenswrapper[4929]: E1002 13:37:05.365739 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="extract-content" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.365746 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="extract-content" Oct 02 13:37:05 crc kubenswrapper[4929]: E1002 13:37:05.365761 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="extract-utilities" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.365767 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="extract-utilities" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.366003 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc57c08-f5de-4a57-b270-d082c7d490c9" containerName="registry-server" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.367603 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.390778 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkng"] Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.523123 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-catalog-content\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.524799 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsdmb\" (UniqueName: \"kubernetes.io/projected/2aef67db-3b5f-4db2-aed9-75f1f7692069-kube-api-access-dsdmb\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.525005 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-utilities\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.627057 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsdmb\" (UniqueName: \"kubernetes.io/projected/2aef67db-3b5f-4db2-aed9-75f1f7692069-kube-api-access-dsdmb\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.627113 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-utilities\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.627287 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-catalog-content\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.627845 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-catalog-content\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.627950 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-utilities\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.648505 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dsdmb\" (UniqueName: \"kubernetes.io/projected/2aef67db-3b5f-4db2-aed9-75f1f7692069-kube-api-access-dsdmb\") pod \"redhat-marketplace-kjkng\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:05 crc kubenswrapper[4929]: I1002 13:37:05.687998 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:06 crc kubenswrapper[4929]: I1002 13:37:06.175168 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkng"] Oct 02 13:37:07 crc kubenswrapper[4929]: I1002 13:37:07.024246 4929 generic.go:334] "Generic (PLEG): container finished" podID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerID="04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5" exitCode=0 Oct 02 13:37:07 crc kubenswrapper[4929]: I1002 13:37:07.024313 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkng" event={"ID":"2aef67db-3b5f-4db2-aed9-75f1f7692069","Type":"ContainerDied","Data":"04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5"} Oct 02 13:37:07 crc kubenswrapper[4929]: I1002 13:37:07.024643 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkng" event={"ID":"2aef67db-3b5f-4db2-aed9-75f1f7692069","Type":"ContainerStarted","Data":"3e3932c6062ea5f616b3e17ce81397b2f1ee1024356a9adcb0f86e606531d549"} Oct 02 13:37:08 crc kubenswrapper[4929]: I1002 13:37:08.036455 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkng" event={"ID":"2aef67db-3b5f-4db2-aed9-75f1f7692069","Type":"ContainerStarted","Data":"cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32"} Oct 02 13:37:09 crc kubenswrapper[4929]: I1002 13:37:09.047090 4929 generic.go:334] "Generic (PLEG): container finished" podID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerID="cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32" exitCode=0 Oct 02 13:37:09 crc kubenswrapper[4929]: I1002 13:37:09.047141 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkng" event={"ID":"2aef67db-3b5f-4db2-aed9-75f1f7692069","Type":"ContainerDied","Data":"cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32"} Oct 02 13:37:10 crc kubenswrapper[4929]: I1002 13:37:10.057998 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkng" event={"ID":"2aef67db-3b5f-4db2-aed9-75f1f7692069","Type":"ContainerStarted","Data":"c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed"} Oct 02 13:37:10 crc kubenswrapper[4929]: I1002 13:37:10.080706 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kjkng" podStartSLOduration=2.589854126 podStartE2EDuration="5.080681579s" podCreationTimestamp="2025-10-02 13:37:05 +0000 UTC" firstStartedPulling="2025-10-02 13:37:07.027755841 +0000 UTC m=+8827.578122215" lastFinishedPulling="2025-10-02 13:37:09.518583304 +0000 UTC m=+8830.068949668" observedRunningTime="2025-10-02 13:37:10.075108477 +0000 UTC m=+8830.625474841" watchObservedRunningTime="2025-10-02 13:37:10.080681579 +0000 UTC m=+8830.631047943" Oct 02 13:37:14 crc kubenswrapper[4929]: I1002 13:37:14.737087 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:37:14 crc kubenswrapper[4929]: I1002 13:37:14.737650 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.568534 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nlhkr"] Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.571673 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.583640 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlhkr"] Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.689303 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.689377 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.739971 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmpp\" (UniqueName: \"kubernetes.io/projected/819ef93b-246f-47a3-8226-99b2e32abd19-kube-api-access-ttmpp\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.740778 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-catalog-content\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.740931 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-utilities\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.749109 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.843953 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-catalog-content\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.844065 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-utilities\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.844442 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-catalog-content\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.844686 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-utilities\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.845069 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmpp\" (UniqueName: \"kubernetes.io/projected/819ef93b-246f-47a3-8226-99b2e32abd19-kube-api-access-ttmpp\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.872134 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmpp\" (UniqueName: \"kubernetes.io/projected/819ef93b-246f-47a3-8226-99b2e32abd19-kube-api-access-ttmpp\") pod \"certified-operators-nlhkr\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:15 crc kubenswrapper[4929]: I1002 13:37:15.946253 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:16 crc kubenswrapper[4929]: I1002 13:37:16.210710 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:16 crc kubenswrapper[4929]: I1002 13:37:16.499836 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlhkr"] Oct 02 13:37:17 crc kubenswrapper[4929]: I1002 13:37:17.130295 4929 generic.go:334] "Generic (PLEG): container finished" podID="819ef93b-246f-47a3-8226-99b2e32abd19" containerID="87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e" exitCode=0 Oct 02 13:37:17 crc kubenswrapper[4929]: I1002 13:37:17.130467 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhkr" event={"ID":"819ef93b-246f-47a3-8226-99b2e32abd19","Type":"ContainerDied","Data":"87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e"} Oct 02 13:37:17 crc kubenswrapper[4929]: I1002 13:37:17.130723 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhkr" event={"ID":"819ef93b-246f-47a3-8226-99b2e32abd19","Type":"ContainerStarted","Data":"bc05b54878dc9ac7634b06add9ce3f1792ed1ffeb8407553a863957ec2b74f1e"} Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.136030 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkng"] Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.143823 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhkr" event={"ID":"819ef93b-246f-47a3-8226-99b2e32abd19","Type":"ContainerStarted","Data":"dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e"} Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.144113 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kjkng" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="registry-server" containerID="cri-o://c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed" gracePeriod=2 Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.671369 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.806824 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-utilities\") pod \"2aef67db-3b5f-4db2-aed9-75f1f7692069\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.807105 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-catalog-content\") pod \"2aef67db-3b5f-4db2-aed9-75f1f7692069\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.807180 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsdmb\" (UniqueName: \"kubernetes.io/projected/2aef67db-3b5f-4db2-aed9-75f1f7692069-kube-api-access-dsdmb\") pod \"2aef67db-3b5f-4db2-aed9-75f1f7692069\" (UID: \"2aef67db-3b5f-4db2-aed9-75f1f7692069\") " Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.807841 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-utilities" (OuterVolumeSpecName: "utilities") pod "2aef67db-3b5f-4db2-aed9-75f1f7692069" (UID: "2aef67db-3b5f-4db2-aed9-75f1f7692069"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.827337 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aef67db-3b5f-4db2-aed9-75f1f7692069-kube-api-access-dsdmb" (OuterVolumeSpecName: "kube-api-access-dsdmb") pod "2aef67db-3b5f-4db2-aed9-75f1f7692069" (UID: "2aef67db-3b5f-4db2-aed9-75f1f7692069"). InnerVolumeSpecName "kube-api-access-dsdmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.836648 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2aef67db-3b5f-4db2-aed9-75f1f7692069" (UID: "2aef67db-3b5f-4db2-aed9-75f1f7692069"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.910193 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsdmb\" (UniqueName: \"kubernetes.io/projected/2aef67db-3b5f-4db2-aed9-75f1f7692069-kube-api-access-dsdmb\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.910229 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:18 crc kubenswrapper[4929]: I1002 13:37:18.910246 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef67db-3b5f-4db2-aed9-75f1f7692069-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.155047 4929 generic.go:334] "Generic (PLEG): container finished" podID="819ef93b-246f-47a3-8226-99b2e32abd19" containerID="dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e" exitCode=0 Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.155095 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhkr" event={"ID":"819ef93b-246f-47a3-8226-99b2e32abd19","Type":"ContainerDied","Data":"dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e"} Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.160314 4929 generic.go:334] "Generic (PLEG): container finished" podID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerID="c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed" exitCode=0 Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.160365 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkng" event={"ID":"2aef67db-3b5f-4db2-aed9-75f1f7692069","Type":"ContainerDied","Data":"c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed"} Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.160385 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjkng" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.160407 4929 scope.go:117] "RemoveContainer" containerID="c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.160396 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjkng" event={"ID":"2aef67db-3b5f-4db2-aed9-75f1f7692069","Type":"ContainerDied","Data":"3e3932c6062ea5f616b3e17ce81397b2f1ee1024356a9adcb0f86e606531d549"} Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.186685 4929 scope.go:117] "RemoveContainer" containerID="cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.207850 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkng"] Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.214444 4929 scope.go:117] "RemoveContainer" containerID="04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.218687 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjkng"] Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.261712 4929 scope.go:117] "RemoveContainer" containerID="c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed" Oct 02 13:37:19 crc kubenswrapper[4929]: E1002 13:37:19.262114 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed\": container with ID starting with c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed not found: ID does not exist" containerID="c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.262158 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed"} err="failed to get container status \"c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed\": rpc error: code = NotFound desc = could not find container \"c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed\": container with ID starting with c261464abf7d43be2916e8d14794fca0dd20e768672a259c4b6b0d6db66d29ed not found: ID does not exist" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.262183 4929 scope.go:117] "RemoveContainer" containerID="cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32" Oct 02 13:37:19 crc kubenswrapper[4929]: E1002 13:37:19.262681 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32\": container with ID starting with cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32 not found: ID does not exist" containerID="cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.262705 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32"} err="failed to get container status \"cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32\": rpc error: code = NotFound desc = could not find 
container \"cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32\": container with ID starting with cbf1770b697cb69e2082db68501b67ceddaa2cd1ff04bba9e10e69b256f0cb32 not found: ID does not exist" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.262718 4929 scope.go:117] "RemoveContainer" containerID="04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5" Oct 02 13:37:19 crc kubenswrapper[4929]: E1002 13:37:19.263145 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5\": container with ID starting with 04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5 not found: ID does not exist" containerID="04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5" Oct 02 13:37:19 crc kubenswrapper[4929]: I1002 13:37:19.263166 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5"} err="failed to get container status \"04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5\": rpc error: code = NotFound desc = could not find container \"04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5\": container with ID starting with 04bed5ae114521ae750b89308a5b76f5b0006c16d34dfe40b7404663ade5f6f5 not found: ID does not exist" Oct 02 13:37:20 crc kubenswrapper[4929]: I1002 13:37:20.173664 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" path="/var/lib/kubelet/pods/2aef67db-3b5f-4db2-aed9-75f1f7692069/volumes" Oct 02 13:37:20 crc kubenswrapper[4929]: I1002 13:37:20.178251 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhkr" event={"ID":"819ef93b-246f-47a3-8226-99b2e32abd19","Type":"ContainerStarted","Data":"f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f"} Oct 02 13:37:20 crc kubenswrapper[4929]: I1002 13:37:20.205729 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nlhkr" podStartSLOduration=2.725553586 podStartE2EDuration="5.205710321s" podCreationTimestamp="2025-10-02 13:37:15 +0000 UTC" firstStartedPulling="2025-10-02 13:37:17.132004351 +0000 UTC m=+8837.682370715" lastFinishedPulling="2025-10-02 13:37:19.612161086 +0000 UTC m=+8840.162527450" observedRunningTime="2025-10-02 13:37:20.198922834 +0000 UTC m=+8840.749289218" watchObservedRunningTime="2025-10-02 13:37:20.205710321 +0000 UTC m=+8840.756076685" Oct 02 13:37:25 crc kubenswrapper[4929]: I1002 13:37:25.946425 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:25 crc kubenswrapper[4929]: I1002 13:37:25.946932 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:25 crc kubenswrapper[4929]: I1002 13:37:25.996836 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:26 crc kubenswrapper[4929]: I1002 13:37:26.280614 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:26 crc kubenswrapper[4929]: I1002 13:37:26.329718 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-nlhkr"] Oct 02 13:37:28 crc kubenswrapper[4929]: I1002 13:37:28.252922 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nlhkr" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" containerName="registry-server" containerID="cri-o://f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f" gracePeriod=2 Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.249134 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.272987 4929 generic.go:334] "Generic (PLEG): container finished" podID="819ef93b-246f-47a3-8226-99b2e32abd19" containerID="f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f" exitCode=0 Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.273032 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhkr" event={"ID":"819ef93b-246f-47a3-8226-99b2e32abd19","Type":"ContainerDied","Data":"f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f"} Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.273057 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlhkr" event={"ID":"819ef93b-246f-47a3-8226-99b2e32abd19","Type":"ContainerDied","Data":"bc05b54878dc9ac7634b06add9ce3f1792ed1ffeb8407553a863957ec2b74f1e"} Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.273076 4929 scope.go:117] "RemoveContainer" containerID="f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.273218 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlhkr" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.306365 4929 scope.go:117] "RemoveContainer" containerID="dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.320149 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-catalog-content\") pod \"819ef93b-246f-47a3-8226-99b2e32abd19\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.320242 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-utilities\") pod \"819ef93b-246f-47a3-8226-99b2e32abd19\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.320303 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttmpp\" (UniqueName: \"kubernetes.io/projected/819ef93b-246f-47a3-8226-99b2e32abd19-kube-api-access-ttmpp\") pod \"819ef93b-246f-47a3-8226-99b2e32abd19\" (UID: \"819ef93b-246f-47a3-8226-99b2e32abd19\") " Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.321864 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-utilities" (OuterVolumeSpecName: "utilities") pod "819ef93b-246f-47a3-8226-99b2e32abd19" (UID: "819ef93b-246f-47a3-8226-99b2e32abd19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.323252 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.327431 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819ef93b-246f-47a3-8226-99b2e32abd19-kube-api-access-ttmpp" (OuterVolumeSpecName: "kube-api-access-ttmpp") pod "819ef93b-246f-47a3-8226-99b2e32abd19" (UID: "819ef93b-246f-47a3-8226-99b2e32abd19"). InnerVolumeSpecName "kube-api-access-ttmpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.332234 4929 scope.go:117] "RemoveContainer" containerID="87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.369793 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "819ef93b-246f-47a3-8226-99b2e32abd19" (UID: "819ef93b-246f-47a3-8226-99b2e32abd19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.419854 4929 scope.go:117] "RemoveContainer" containerID="f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f" Oct 02 13:37:29 crc kubenswrapper[4929]: E1002 13:37:29.420310 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f\": container with ID starting with f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f not found: ID does not exist" containerID="f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.420347 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f"} err="failed to get container status \"f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f\": rpc error: code = NotFound desc = could not find container \"f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f\": container with ID starting with f7dc3820a7399c8036d015e4b00f614fdbf2b0012c1da0f37386c7362255456f not found: ID does not exist" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.420372 4929 scope.go:117] "RemoveContainer" containerID="dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e" Oct 02 13:37:29 crc kubenswrapper[4929]: E1002 13:37:29.420627 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e\": container with ID starting with dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e not found: ID does not exist" containerID="dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.420655 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e"} err="failed to get container status 
\"dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e\": rpc error: code = NotFound desc = could not find container \"dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e\": container with ID starting with dcda2b7c8709ec37a1a3d16075e00b93ae7dffe029d0a7500b9895cedc69db6e not found: ID does not exist" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.420672 4929 scope.go:117] "RemoveContainer" containerID="87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e" Oct 02 13:37:29 crc kubenswrapper[4929]: E1002 13:37:29.421071 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e\": container with ID starting with 87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e not found: ID does not exist" containerID="87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.421096 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e"} err="failed to get container status \"87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e\": rpc error: code = NotFound desc = could not find container \"87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e\": container with ID starting with 87adf9103c14a5d7579ab293fcef8c7c55300909c1cef967d04e8548cb48226e not found: ID does not exist" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.425719 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/819ef93b-246f-47a3-8226-99b2e32abd19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.425745 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttmpp\" (UniqueName: \"kubernetes.io/projected/819ef93b-246f-47a3-8226-99b2e32abd19-kube-api-access-ttmpp\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.606711 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nlhkr"] Oct 02 13:37:29 crc kubenswrapper[4929]: I1002 13:37:29.616448 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nlhkr"] Oct 02 13:37:30 crc kubenswrapper[4929]: I1002 13:37:30.177251 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" path="/var/lib/kubelet/pods/819ef93b-246f-47a3-8226-99b2e32abd19/volumes" Oct 02 13:37:43 crc kubenswrapper[4929]: I1002 13:37:43.409101 4929 generic.go:334] "Generic (PLEG): container finished" podID="e7bef8fe-3f7d-4798-b217-76996aab4a9f" containerID="baf0bdd48b3ffb2cd62cc61c26b47ce038a2150571346192dc2b88a8f1a4218a" exitCode=0 Oct 02 13:37:43 crc kubenswrapper[4929]: I1002 13:37:43.409197 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" event={"ID":"e7bef8fe-3f7d-4798-b217-76996aab4a9f","Type":"ContainerDied","Data":"baf0bdd48b3ffb2cd62cc61c26b47ce038a2150571346192dc2b88a8f1a4218a"} Oct 02 13:37:44 crc kubenswrapper[4929]: I1002 13:37:44.736326 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:37:44 crc kubenswrapper[4929]: I1002 13:37:44.736887 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:37:44 crc kubenswrapper[4929]: I1002 13:37:44.736979 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:37:44 crc kubenswrapper[4929]: I1002 13:37:44.738159 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:37:44 crc kubenswrapper[4929]: I1002 13:37:44.738230 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" gracePeriod=600 Oct 02 13:37:44 crc kubenswrapper[4929]: E1002 13:37:44.991490 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.231016 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.274610 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-inventory\") pod \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.274837 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m957g\" (UniqueName: \"kubernetes.io/projected/e7bef8fe-3f7d-4798-b217-76996aab4a9f-kube-api-access-m957g\") pod \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.274893 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ceph\") pod \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.274974 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-agent-neutron-config-0\") pod \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.275092 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ssh-key\") pod \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.275134 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-combined-ca-bundle\") pod \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\" (UID: \"e7bef8fe-3f7d-4798-b217-76996aab4a9f\") " Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.292310 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "e7bef8fe-3f7d-4798-b217-76996aab4a9f" (UID: "e7bef8fe-3f7d-4798-b217-76996aab4a9f"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.292398 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ceph" (OuterVolumeSpecName: "ceph") pod "e7bef8fe-3f7d-4798-b217-76996aab4a9f" (UID: "e7bef8fe-3f7d-4798-b217-76996aab4a9f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.292527 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bef8fe-3f7d-4798-b217-76996aab4a9f-kube-api-access-m957g" (OuterVolumeSpecName: "kube-api-access-m957g") pod "e7bef8fe-3f7d-4798-b217-76996aab4a9f" (UID: "e7bef8fe-3f7d-4798-b217-76996aab4a9f"). InnerVolumeSpecName "kube-api-access-m957g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.307268 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "e7bef8fe-3f7d-4798-b217-76996aab4a9f" (UID: "e7bef8fe-3f7d-4798-b217-76996aab4a9f"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.308309 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-inventory" (OuterVolumeSpecName: "inventory") pod "e7bef8fe-3f7d-4798-b217-76996aab4a9f" (UID: "e7bef8fe-3f7d-4798-b217-76996aab4a9f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.309331 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e7bef8fe-3f7d-4798-b217-76996aab4a9f" (UID: "e7bef8fe-3f7d-4798-b217-76996aab4a9f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.377668 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.377701 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m957g\" (UniqueName: \"kubernetes.io/projected/e7bef8fe-3f7d-4798-b217-76996aab4a9f-kube-api-access-m957g\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.377713 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.377723 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.377731 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.377739 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bef8fe-3f7d-4798-b217-76996aab4a9f-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.428519 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" event={"ID":"e7bef8fe-3f7d-4798-b217-76996aab4a9f","Type":"ContainerDied","Data":"6735721daa32c47081db22d22d4ebf75e191807597a7fe73f4b61eae06aaf58e"} Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.428650 4929 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6735721daa32c47081db22d22d4ebf75e191807597a7fe73f4b61eae06aaf58e" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.428992 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gq6bf" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.431378 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" exitCode=0 Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.431411 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"} Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.431432 4929 scope.go:117] "RemoveContainer" containerID="46462258430a3a49d52c657e107f70a0d80ed5020c6c7156d19d7bc9db5db7c4" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.432222 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 13:37:45.433290 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.538568 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"] Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 13:37:45.539250 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="extract-utilities" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539267 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="extract-utilities" Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 13:37:45.539289 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="extract-content" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539296 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="extract-content" Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 13:37:45.539313 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" containerName="extract-content" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539321 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" containerName="extract-content" Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 13:37:45.539349 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="registry-server" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539356 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="registry-server" Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539372 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" containerName="extract-utilities"
Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 13:37:45.539380 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" containerName="registry-server"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539387 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" containerName="registry-server"
Oct 02 13:37:45 crc kubenswrapper[4929]: E1002 13:37:45.539399 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bef8fe-3f7d-4798-b217-76996aab4a9f" containerName="neutron-sriov-openstack-openstack-cell1"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539405 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bef8fe-3f7d-4798-b217-76996aab4a9f" containerName="neutron-sriov-openstack-openstack-cell1"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539594 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bef8fe-3f7d-4798-b217-76996aab4a9f" containerName="neutron-sriov-openstack-openstack-cell1"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539609 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="819ef93b-246f-47a3-8226-99b2e32abd19" containerName="registry-server"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.539626 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aef67db-3b5f-4db2-aed9-75f1f7692069" containerName="registry-server"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.540429 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.543618 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.543866 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.543890 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.544031 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.544212 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.577629 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"]
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.583618 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.583771 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.583797 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.583823 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.583854 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bwls\" (UniqueName: \"kubernetes.io/projected/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-kube-api-access-5bwls\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"
Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.583924 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"
\"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.685804 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.685864 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.685910 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.685968 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bwls\" (UniqueName: \"kubernetes.io/projected/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-kube-api-access-5bwls\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.686066 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.686122 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.691225 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.691327 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.691766 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.692848 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.693270 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.706043 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bwls\" (UniqueName: \"kubernetes.io/projected/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-kube-api-access-5bwls\") pod \"neutron-dhcp-openstack-openstack-cell1-qhj87\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:37:45 crc kubenswrapper[4929]: I1002 13:37:45.865319 4929 util.go:30] "No sandbox for pod can be found. 
Oct 02 13:37:46 crc kubenswrapper[4929]: I1002 13:37:46.384594 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qhj87"]
Oct 02 13:37:46 crc kubenswrapper[4929]: I1002 13:37:46.444538 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" event={"ID":"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e","Type":"ContainerStarted","Data":"5adee236eb9317eb1525de1b6b0a235a8797ac45b3bf75b6a9d122bc9bc4b35d"}
Oct 02 13:37:47 crc kubenswrapper[4929]: I1002 13:37:47.456369 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" event={"ID":"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e","Type":"ContainerStarted","Data":"e0f5ba3a6faf5e6fe5d9203d0cbcac2b9656cf1b7ef56cc793e02d13a252e7e6"}
Oct 02 13:37:47 crc kubenswrapper[4929]: I1002 13:37:47.490514 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" podStartSLOduration=2.029445336 podStartE2EDuration="2.490490671s" podCreationTimestamp="2025-10-02 13:37:45 +0000 UTC" firstStartedPulling="2025-10-02 13:37:46.386193909 +0000 UTC m=+8866.936560273" lastFinishedPulling="2025-10-02 13:37:46.847239244 +0000 UTC m=+8867.397605608" observedRunningTime="2025-10-02 13:37:47.483111247 +0000 UTC m=+8868.033477611" watchObservedRunningTime="2025-10-02 13:37:47.490490671 +0000 UTC m=+8868.040857035"
Oct 02 13:37:58 crc kubenswrapper[4929]: I1002 13:37:58.158575 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:37:58 crc kubenswrapper[4929]: E1002 13:37:58.160386 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:38:10 crc kubenswrapper[4929]: I1002 13:38:10.171653 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:38:10 crc kubenswrapper[4929]: E1002 13:38:10.172557 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:38:22 crc kubenswrapper[4929]: I1002 13:38:22.157326 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:38:22 crc kubenswrapper[4929]: E1002 13:38:22.158050 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:38:37 crc kubenswrapper[4929]: I1002 13:38:37.156620 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:38:37 crc kubenswrapper[4929]: E1002 13:38:37.157364 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:38:51 crc kubenswrapper[4929]: I1002 13:38:51.157058 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:38:51 crc kubenswrapper[4929]: E1002 13:38:51.157736 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:39:02 crc kubenswrapper[4929]: I1002 13:39:02.157533 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:39:02 crc kubenswrapper[4929]: E1002 13:39:02.158418 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:39:17 crc kubenswrapper[4929]: I1002 13:39:17.156793 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:39:17 crc kubenswrapper[4929]: E1002 13:39:17.157731 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:39:30 crc kubenswrapper[4929]: I1002 13:39:30.164353 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:39:30 crc kubenswrapper[4929]: E1002 13:39:30.165395 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:39:41 crc kubenswrapper[4929]: I1002 13:39:41.157523 4929 scope.go:117] "RemoveContainer" 
containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:39:41 crc kubenswrapper[4929]: E1002 13:39:41.158284 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:39:53 crc kubenswrapper[4929]: I1002 13:39:53.157058 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:39:53 crc kubenswrapper[4929]: E1002 13:39:53.157806 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:40:06 crc kubenswrapper[4929]: I1002 13:40:06.156844 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:40:06 crc kubenswrapper[4929]: E1002 13:40:06.158162 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:40:17 crc kubenswrapper[4929]: I1002 13:40:17.157706 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:40:17 crc kubenswrapper[4929]: E1002 13:40:17.158806 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:40:28 crc kubenswrapper[4929]: I1002 13:40:28.157493 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:40:28 crc kubenswrapper[4929]: E1002 13:40:28.158310 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:40:43 crc kubenswrapper[4929]: I1002 13:40:43.156930 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:40:43 crc kubenswrapper[4929]: E1002 13:40:43.157648 4929 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:40:46 crc kubenswrapper[4929]: I1002 13:40:46.234595 4929 generic.go:334] "Generic (PLEG): container finished" podID="fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" containerID="e0f5ba3a6faf5e6fe5d9203d0cbcac2b9656cf1b7ef56cc793e02d13a252e7e6" exitCode=0 Oct 02 13:40:46 crc kubenswrapper[4929]: I1002 13:40:46.235030 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" event={"ID":"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e","Type":"ContainerDied","Data":"e0f5ba3a6faf5e6fe5d9203d0cbcac2b9656cf1b7ef56cc793e02d13a252e7e6"} Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.689793 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.816802 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-combined-ca-bundle\") pod \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.816879 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-inventory\") pod \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.817003 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-agent-neutron-config-0\") pod \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.817055 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bwls\" (UniqueName: \"kubernetes.io/projected/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-kube-api-access-5bwls\") pod \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.817077 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ssh-key\") pod \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.817113 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ceph\") pod \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\" (UID: \"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e\") " Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.822155 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" (UID: "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.822213 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ceph" (OuterVolumeSpecName: "ceph") pod "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" (UID: "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.822550 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-kube-api-access-5bwls" (OuterVolumeSpecName: "kube-api-access-5bwls") pod "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" (UID: "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e"). InnerVolumeSpecName "kube-api-access-5bwls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.844924 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" (UID: "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.846850 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-inventory" (OuterVolumeSpecName: "inventory") pod "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" (UID: "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.848480 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" (UID: "fc0badd8-a9ba-44d5-9ceb-600392fd2c2e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.920005 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.920239 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bwls\" (UniqueName: \"kubernetes.io/projected/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-kube-api-access-5bwls\") on node \"crc\" DevicePath \"\"" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.920302 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.920365 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.920437 4929 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:40:47 crc kubenswrapper[4929]: I1002 13:40:47.920497 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc0badd8-a9ba-44d5-9ceb-600392fd2c2e-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:40:48 crc kubenswrapper[4929]: I1002 13:40:48.259882 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qhj87" event={"ID":"fc0badd8-a9ba-44d5-9ceb-600392fd2c2e","Type":"ContainerDied","Data":"5adee236eb9317eb1525de1b6b0a235a8797ac45b3bf75b6a9d122bc9bc4b35d"} Oct 02 13:40:48 crc kubenswrapper[4929]: I1002 13:40:48.259926 4929 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 13:40:48 crc kubenswrapper[4929]: I1002 13:40:48.261202 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5adee236eb9317eb1525de1b6b0a235a8797ac45b3bf75b6a9d122bc9bc4b35d"
Oct 02 13:40:57 crc kubenswrapper[4929]: I1002 13:40:57.156646 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:40:57 crc kubenswrapper[4929]: E1002 13:40:57.157728 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:41:10 crc kubenswrapper[4929]: I1002 13:41:10.164524 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:41:10 crc kubenswrapper[4929]: E1002 13:41:10.165219 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:41:20 crc kubenswrapper[4929]: I1002 13:41:20.965559 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 13:41:20 crc kubenswrapper[4929]: I1002 13:41:20.968217 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a2eabde0-bfe0-456f-a226-8eb41402dd41" containerName="nova-cell0-conductor-conductor" containerID="cri-o://aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e" gracePeriod=30
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.156668 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:41:21 crc kubenswrapper[4929]: E1002 13:41:21.157032 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.592644 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.592918 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ad8f9734-246a-44d2-8538-71b18a0ccabc" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" gracePeriod=30
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.741175 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.741542 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-api" containerID="cri-o://04d9f919abb6004df897eed30ca7db6741b0be4fa6e87201ab832dfa31f16088" gracePeriod=30
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.741471 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-log" containerID="cri-o://4e323859df312ddcfa507ba095441a7a07d40e00202515d1ce964df782bb0cac" gracePeriod=30
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.762157 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.762565 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e1e8b15f-d208-40d8-884c-3a80b25bbfcb" containerName="nova-scheduler-scheduler" containerID="cri-o://bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" gracePeriod=30
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.924004 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.924261 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-log" containerID="cri-o://8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044" gracePeriod=30
Oct 02 13:41:21 crc kubenswrapper[4929]: I1002 13:41:21.924733 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-metadata" containerID="cri-o://5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0" gracePeriod=30
Oct 02 13:41:22 crc kubenswrapper[4929]: I1002 13:41:22.612175 4929 generic.go:334] "Generic (PLEG): container finished" podID="a2eabde0-bfe0-456f-a226-8eb41402dd41" containerID="aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e" exitCode=0
Oct 02 13:41:22 crc kubenswrapper[4929]: I1002 13:41:22.612393 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a2eabde0-bfe0-456f-a226-8eb41402dd41","Type":"ContainerDied","Data":"aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e"}
Oct 02 13:41:22 crc kubenswrapper[4929]: I1002 13:41:22.619372 4929 generic.go:334] "Generic (PLEG): container finished" podID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerID="4e323859df312ddcfa507ba095441a7a07d40e00202515d1ce964df782bb0cac" exitCode=143
Oct 02 13:41:22 crc kubenswrapper[4929]: I1002 13:41:22.619450 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56e885b6-4a14-4f86-9706-4c4df78b2704","Type":"ContainerDied","Data":"4e323859df312ddcfa507ba095441a7a07d40e00202515d1ce964df782bb0cac"}
Oct 02 13:41:22 crc kubenswrapper[4929]: I1002 13:41:22.625989 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33f6d9fc-e75a-4cb6-a760-bcef63340ba0","Type":"ContainerDied","Data":"8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044"}
Oct 02 13:41:22 crc kubenswrapper[4929]: I1002 13:41:22.625946 4929 generic.go:334] "Generic (PLEG): container finished" podID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerID="8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044" exitCode=143
container finished" podID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerID="8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044" exitCode=143 Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.969917 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69 is running failed: container process not found" containerID="4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.969908 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e is running failed: container process not found" containerID="aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.971551 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69 is running failed: container process not found" containerID="4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.971713 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e is running failed: container process not found" containerID="aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.971916 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e is running failed: container process not found" containerID="aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.971945 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a2eabde0-bfe0-456f-a226-8eb41402dd41" containerName="nova-cell0-conductor-conductor" Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.972130 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69 is running failed: container process not found" containerID="4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 13:41:22 crc kubenswrapper[4929]: E1002 13:41:22.972176 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or 
Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:22.999000 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813 is running failed: container process not found" containerID="bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:22.999221 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813 is running failed: container process not found" containerID="bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.001803 4929 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813 is running failed: container process not found" containerID="bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.001837 4929 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e1e8b15f-d208-40d8-884c-3a80b25bbfcb" containerName="nova-scheduler-scheduler"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.091003 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.218561 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-combined-ca-bundle\") pod \"a2eabde0-bfe0-456f-a226-8eb41402dd41\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.218625 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-config-data\") pod \"a2eabde0-bfe0-456f-a226-8eb41402dd41\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.218775 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6jsg\" (UniqueName: \"kubernetes.io/projected/a2eabde0-bfe0-456f-a226-8eb41402dd41-kube-api-access-z6jsg\") pod \"a2eabde0-bfe0-456f-a226-8eb41402dd41\" (UID: \"a2eabde0-bfe0-456f-a226-8eb41402dd41\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.228273 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2eabde0-bfe0-456f-a226-8eb41402dd41-kube-api-access-z6jsg" (OuterVolumeSpecName: "kube-api-access-z6jsg") pod "a2eabde0-bfe0-456f-a226-8eb41402dd41" (UID: "a2eabde0-bfe0-456f-a226-8eb41402dd41"). InnerVolumeSpecName "kube-api-access-z6jsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.258585 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-config-data" (OuterVolumeSpecName: "config-data") pod "a2eabde0-bfe0-456f-a226-8eb41402dd41" (UID: "a2eabde0-bfe0-456f-a226-8eb41402dd41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.269186 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2eabde0-bfe0-456f-a226-8eb41402dd41" (UID: "a2eabde0-bfe0-456f-a226-8eb41402dd41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.312644 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.319348 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.322137 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.322283 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2eabde0-bfe0-456f-a226-8eb41402dd41-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.322372 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6jsg\" (UniqueName: \"kubernetes.io/projected/a2eabde0-bfe0-456f-a226-8eb41402dd41-kube-api-access-z6jsg\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.424000 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-config-data\") pod \"ad8f9734-246a-44d2-8538-71b18a0ccabc\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.424188 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-config-data\") pod \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.424268 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk7f7\" (UniqueName: \"kubernetes.io/projected/ad8f9734-246a-44d2-8538-71b18a0ccabc-kube-api-access-nk7f7\") pod \"ad8f9734-246a-44d2-8538-71b18a0ccabc\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.424314 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl4kv\" (UniqueName: \"kubernetes.io/projected/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-kube-api-access-xl4kv\") pod \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.424660 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-combined-ca-bundle\") pod \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\" (UID: \"e1e8b15f-d208-40d8-884c-3a80b25bbfcb\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.424737 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-combined-ca-bundle\") pod \"ad8f9734-246a-44d2-8538-71b18a0ccabc\" (UID: \"ad8f9734-246a-44d2-8538-71b18a0ccabc\") " Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.427870 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8f9734-246a-44d2-8538-71b18a0ccabc-kube-api-access-nk7f7" (OuterVolumeSpecName: "kube-api-access-nk7f7") pod "ad8f9734-246a-44d2-8538-71b18a0ccabc" (UID: "ad8f9734-246a-44d2-8538-71b18a0ccabc"). InnerVolumeSpecName "kube-api-access-nk7f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.428444 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-kube-api-access-xl4kv" (OuterVolumeSpecName: "kube-api-access-xl4kv") pod "e1e8b15f-d208-40d8-884c-3a80b25bbfcb" (UID: "e1e8b15f-d208-40d8-884c-3a80b25bbfcb"). InnerVolumeSpecName "kube-api-access-xl4kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.463442 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad8f9734-246a-44d2-8538-71b18a0ccabc" (UID: "ad8f9734-246a-44d2-8538-71b18a0ccabc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.473427 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-config-data" (OuterVolumeSpecName: "config-data") pod "e1e8b15f-d208-40d8-884c-3a80b25bbfcb" (UID: "e1e8b15f-d208-40d8-884c-3a80b25bbfcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.475523 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1e8b15f-d208-40d8-884c-3a80b25bbfcb" (UID: "e1e8b15f-d208-40d8-884c-3a80b25bbfcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.478286 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-config-data" (OuterVolumeSpecName: "config-data") pod "ad8f9734-246a-44d2-8538-71b18a0ccabc" (UID: "ad8f9734-246a-44d2-8538-71b18a0ccabc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.527107 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.527143 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.527156 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad8f9734-246a-44d2-8538-71b18a0ccabc-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.527193 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.527201 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk7f7\" (UniqueName: \"kubernetes.io/projected/ad8f9734-246a-44d2-8538-71b18a0ccabc-kube-api-access-nk7f7\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.527211 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl4kv\" (UniqueName: \"kubernetes.io/projected/e1e8b15f-d208-40d8-884c-3a80b25bbfcb-kube-api-access-xl4kv\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.636259 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.636242 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a2eabde0-bfe0-456f-a226-8eb41402dd41","Type":"ContainerDied","Data":"413c6c9b67f96226467a77d33005540529da63f095758d7ab567973179bad380"} Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.636407 4929 scope.go:117] "RemoveContainer" containerID="aa919ad40b66ca61868409dc4784ecc67a350a726743852bb65562d7aa7f906e" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.638318 4929 generic.go:334] "Generic (PLEG): container finished" podID="ad8f9734-246a-44d2-8538-71b18a0ccabc" containerID="4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" exitCode=0 Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.638391 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.638403 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ad8f9734-246a-44d2-8538-71b18a0ccabc","Type":"ContainerDied","Data":"4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69"} Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.638446 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ad8f9734-246a-44d2-8538-71b18a0ccabc","Type":"ContainerDied","Data":"c001dfc6e45118c1b51f95fa473e3ebd546e7a0f6d38b726f7f939cf215f09f1"} Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.639810 4929 generic.go:334] "Generic (PLEG): container finished" podID="e1e8b15f-d208-40d8-884c-3a80b25bbfcb" containerID="bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" exitCode=0 Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.639870 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.639891 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1e8b15f-d208-40d8-884c-3a80b25bbfcb","Type":"ContainerDied","Data":"bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813"} Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.640104 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1e8b15f-d208-40d8-884c-3a80b25bbfcb","Type":"ContainerDied","Data":"789890918cc796bfc8d773cbb8288b01e6c832e7a4a66ce1bb0f261690d63d50"} Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.666660 4929 scope.go:117] "RemoveContainer" containerID="4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.673039 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.684164 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.695208 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.703399 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.709998 4929 scope.go:117] "RemoveContainer" containerID="4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.710434 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69\": container with ID starting with 4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69 not found: ID does not exist" containerID="4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.710479 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69"} err="failed to get container status \"4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69\": rpc error: code = NotFound desc = could not find container 
\"4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69\": container with ID starting with 4c17b4bc1ae21783563abb2fb4fbd414835587252081742345ce5066f6016a69 not found: ID does not exist" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.710500 4929 scope.go:117] "RemoveContainer" containerID="bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.715598 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.716116 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2eabde0-bfe0-456f-a226-8eb41402dd41" containerName="nova-cell0-conductor-conductor" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716136 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2eabde0-bfe0-456f-a226-8eb41402dd41" containerName="nova-cell0-conductor-conductor" Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.716155 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e8b15f-d208-40d8-884c-3a80b25bbfcb" containerName="nova-scheduler-scheduler" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716162 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e8b15f-d208-40d8-884c-3a80b25bbfcb" containerName="nova-scheduler-scheduler" Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.716175 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716181 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.716193 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8f9734-246a-44d2-8538-71b18a0ccabc" containerName="nova-cell1-conductor-conductor" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716201 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8f9734-246a-44d2-8538-71b18a0ccabc" containerName="nova-cell1-conductor-conductor" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716443 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e8b15f-d208-40d8-884c-3a80b25bbfcb" containerName="nova-scheduler-scheduler" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716471 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8f9734-246a-44d2-8538-71b18a0ccabc" containerName="nova-cell1-conductor-conductor" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716481 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2eabde0-bfe0-456f-a226-8eb41402dd41" containerName="nova-cell0-conductor-conductor" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.716493 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc0badd8-a9ba-44d5-9ceb-600392fd2c2e" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.717363 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.723666 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.726105 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.731225 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2217934-7871-4af0-a947-bcc4c0e7c63d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.731529 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2217934-7871-4af0-a947-bcc4c0e7c63d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.731784 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld5r8\" (UniqueName: \"kubernetes.io/projected/b2217934-7871-4af0-a947-bcc4c0e7c63d-kube-api-access-ld5r8\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.733883 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.744030 4929 scope.go:117] "RemoveContainer" containerID="bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" Oct 02 13:41:23 crc kubenswrapper[4929]: E1002 13:41:23.747266 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813\": container with ID starting with bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813 not found: ID does not exist" containerID="bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.747353 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813"} err="failed to get container status \"bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813\": rpc error: code = NotFound desc = could not find container \"bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813\": container with ID starting with bfe369bd3c5ad928bb95f7ae6b70a31c1890891e38d27afd5d7957e766ce3813 not found: ID does not exist" Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.754458 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.755945 4929 util.go:30] "No sandbox for pod can be found. 
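The interleaved "SyncLoop DELETE" / "SyncLoop REMOVE" / "SyncLoop ADD" records are kubelet consuming API-server watch events: each conductor and scheduler pod is deleted and immediately re-created under a new UID, and the cpu_manager/memory_manager "RemoveStaleState" records clear per-container accounting left behind by the old UIDs. The same churn can be observed off the node; a small sketch using the kubernetes Python client (assumes the `kubernetes` package and a kubeconfig with access to this cluster):

    # Sketch: watch the pod lifecycle from the API side. A DELETED event
    # followed by ADDED with a new UID corresponds to the kubelet records
    # "SyncLoop DELETE" / "SyncLoop REMOVE" / "SyncLoop ADD" above.
    from kubernetes import client, config, watch

    config.load_kube_config()
    v1 = client.CoreV1Api()

    w = watch.Watch()
    for ev in w.stream(v1.list_namespaced_pod, namespace="openstack", timeout_seconds=60):
        pod = ev["object"]
        print(ev["type"], pod.metadata.name, pod.metadata.uid)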
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.759601 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.766161 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.777345 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.779181 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.781303 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.787644 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.798640 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834002 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld5r8\" (UniqueName: \"kubernetes.io/projected/b2217934-7871-4af0-a947-bcc4c0e7c63d-kube-api-access-ld5r8\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834075 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900ac3ae-014e-4278-b930-71cdabe0da9d-config-data\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834242 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900ac3ae-014e-4278-b930-71cdabe0da9d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834336 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jv5\" (UniqueName: \"kubernetes.io/projected/900ac3ae-014e-4278-b930-71cdabe0da9d-kube-api-access-m8jv5\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834375 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2217934-7871-4af0-a947-bcc4c0e7c63d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834525 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834626 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2217934-7871-4af0-a947-bcc4c0e7c63d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834688 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxdpd\" (UniqueName: \"kubernetes.io/projected/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-kube-api-access-rxdpd\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.834729 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.838230 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2217934-7871-4af0-a947-bcc4c0e7c63d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.843689 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2217934-7871-4af0-a947-bcc4c0e7c63d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.849377 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld5r8\" (UniqueName: \"kubernetes.io/projected/b2217934-7871-4af0-a947-bcc4c0e7c63d-kube-api-access-ld5r8\") pod \"nova-cell1-conductor-0\" (UID: \"b2217934-7871-4af0-a947-bcc4c0e7c63d\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.937125 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jv5\" (UniqueName: \"kubernetes.io/projected/900ac3ae-014e-4278-b930-71cdabe0da9d-kube-api-access-m8jv5\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.937215 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.937268 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxdpd\" (UniqueName: \"kubernetes.io/projected/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-kube-api-access-rxdpd\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.937303 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.937364 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900ac3ae-014e-4278-b930-71cdabe0da9d-config-data\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.937396 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900ac3ae-014e-4278-b930-71cdabe0da9d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.940215 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.940572 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.941425 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900ac3ae-014e-4278-b930-71cdabe0da9d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.941504 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900ac3ae-014e-4278-b930-71cdabe0da9d-config-data\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.960148 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jv5\" (UniqueName: \"kubernetes.io/projected/900ac3ae-014e-4278-b930-71cdabe0da9d-kube-api-access-m8jv5\") pod \"nova-scheduler-0\" (UID: \"900ac3ae-014e-4278-b930-71cdabe0da9d\") " pod="openstack/nova-scheduler-0"
Oct 02 13:41:23 crc kubenswrapper[4929]: I1002 13:41:23.960467 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxdpd\" (UniqueName: \"kubernetes.io/projected/dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac-kube-api-access-rxdpd\") pod \"nova-cell0-conductor-0\" (UID: \"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.058414 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.082838 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
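The mount-side records mirror the earlier teardown: for each new pod UID the reconciler logs VerifyControllerAttachedVolume (reconciler_common.go:245), then "MountVolume started" (reconciler_common.go:218), then "MountVolume.SetUp succeeded" (operation_generator.go:637). Each pod carries the same three volumes: two secret-backed mounts plus a projected service-account token. Roughly what those volume stanzas look like when built with the kubernetes Python client (only nova-scheduler-config-data is named in the log's reflector record; the secret behind combined-ca-bundle is an assumption):

    # Sketch of the volumes being mounted above (kubernetes Python client).
    # "nova-scheduler-config-data" appears in the reflector record; the
    # "combined-ca-bundle" secret name is assumed. The kube-api-access-*
    # volume is a projected token the API server injects automatically.
    from kubernetes import client

    volumes = [
        client.V1Volume(
            name="config-data",
            secret=client.V1SecretVolumeSource(secret_name="nova-scheduler-config-data"),
        ),
        client.V1Volume(
            name="combined-ca-bundle",
            secret=client.V1SecretVolumeSource(secret_name="combined-ca-bundle"),  # assumed name
        ),
    ]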
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.104694 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.176607 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2eabde0-bfe0-456f-a226-8eb41402dd41" path="/var/lib/kubelet/pods/a2eabde0-bfe0-456f-a226-8eb41402dd41/volumes"
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.178007 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8f9734-246a-44d2-8538-71b18a0ccabc" path="/var/lib/kubelet/pods/ad8f9734-246a-44d2-8538-71b18a0ccabc/volumes"
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.178511 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e8b15f-d208-40d8-884c-3a80b25bbfcb" path="/var/lib/kubelet/pods/e1e8b15f-d208-40d8-884c-3a80b25bbfcb/volumes"
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.549569 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.637636 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.658719 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b2217934-7871-4af0-a947-bcc4c0e7c63d","Type":"ContainerStarted","Data":"e528679bb0051b08dc7d77aa3621e89ac8ab1ed337c6d9a6036a0d8627427a10"}
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.659907 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"900ac3ae-014e-4278-b930-71cdabe0da9d","Type":"ContainerStarted","Data":"c40fcb80c4f4f7d27242c173d76a13830f7b013c1002ebc729b3509dc3aefe6f"}
Oct 02 13:41:24 crc kubenswrapper[4929]: I1002 13:41:24.803222 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 13:41:24 crc kubenswrapper[4929]: W1002 13:41:24.818385 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc2e75da_54a2_4f25_a6f5_151c7a5bc0ac.slice/crio-d6b35bfbbab41f2cb1e772e2d7c8640f62734224bb7b617bf818634dc4ac47ab WatchSource:0}: Error finding container d6b35bfbbab41f2cb1e772e2d7c8640f62734224bb7b617bf818634dc4ac47ab: Status 404 returned error can't find the container with id d6b35bfbbab41f2cb1e772e2d7c8640f62734224bb7b617bf818634dc4ac47ab
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.503488 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": read tcp 10.217.0.2:41280->10.217.1.84:8775: read: connection reset by peer"
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.504393 4929 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.84:8775/\": read tcp 10.217.0.2:41288->10.217.1.84:8775: read: connection reset by peer"
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.679594 4929 generic.go:334] "Generic (PLEG): container finished" podID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerID="04d9f919abb6004df897eed30ca7db6741b0be4fa6e87201ab832dfa31f16088" exitCode=0
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.679661 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56e885b6-4a14-4f86-9706-4c4df78b2704","Type":"ContainerDied","Data":"04d9f919abb6004df897eed30ca7db6741b0be4fa6e87201ab832dfa31f16088"}
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.681194 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"900ac3ae-014e-4278-b930-71cdabe0da9d","Type":"ContainerStarted","Data":"37f41ba0cc42f5027eb507ff32cf1253dc2c236c672f3e07348468d79b208a9a"}
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.684812 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac","Type":"ContainerStarted","Data":"b911390a1a224ce0f30aa92d0784d5b4eaabdf8d0d3adbfd6325e2279c3b9c95"}
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.684850 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac","Type":"ContainerStarted","Data":"d6b35bfbbab41f2cb1e772e2d7c8640f62734224bb7b617bf818634dc4ac47ab"}
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.685533 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.691115 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b2217934-7871-4af0-a947-bcc4c0e7c63d","Type":"ContainerStarted","Data":"b078275099dd7d9d31579f1d56efd1431fd870a0a154186b6a39abb971461bf5"}
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.692223 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.713398 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.713380013 podStartE2EDuration="2.713380013s" podCreationTimestamp="2025-10-02 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:41:25.708866343 +0000 UTC m=+9086.259232697" watchObservedRunningTime="2025-10-02 13:41:25.713380013 +0000 UTC m=+9086.263746377"
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.741897 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.741876989 podStartE2EDuration="2.741876989s" podCreationTimestamp="2025-10-02 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:41:25.731026185 +0000 UTC m=+9086.281392549" watchObservedRunningTime="2025-10-02 13:41:25.741876989 +0000 UTC m=+9086.292243353"
Oct 02 13:41:25 crc kubenswrapper[4929]: I1002 13:41:25.760787 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.760765197 podStartE2EDuration="2.760765197s" podCreationTimestamp="2025-10-02 13:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:41:25.748977205 +0000 UTC m=+9086.299343569" watchObservedRunningTime="2025-10-02 13:41:25.760765197 +0000 UTC m=+9086.311131561"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.175911 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.342744 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-config-data\") pod \"56e885b6-4a14-4f86-9706-4c4df78b2704\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") "
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.342941 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clpsr\" (UniqueName: \"kubernetes.io/projected/56e885b6-4a14-4f86-9706-4c4df78b2704-kube-api-access-clpsr\") pod \"56e885b6-4a14-4f86-9706-4c4df78b2704\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") "
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.343087 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e885b6-4a14-4f86-9706-4c4df78b2704-logs\") pod \"56e885b6-4a14-4f86-9706-4c4df78b2704\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") "
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.343193 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-combined-ca-bundle\") pod \"56e885b6-4a14-4f86-9706-4c4df78b2704\" (UID: \"56e885b6-4a14-4f86-9706-4c4df78b2704\") "
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.345465 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e885b6-4a14-4f86-9706-4c4df78b2704-logs" (OuterVolumeSpecName: "logs") pod "56e885b6-4a14-4f86-9706-4c4df78b2704" (UID: "56e885b6-4a14-4f86-9706-4c4df78b2704"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.350998 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e885b6-4a14-4f86-9706-4c4df78b2704-kube-api-access-clpsr" (OuterVolumeSpecName: "kube-api-access-clpsr") pod "56e885b6-4a14-4f86-9706-4c4df78b2704" (UID: "56e885b6-4a14-4f86-9706-4c4df78b2704"). InnerVolumeSpecName "kube-api-access-clpsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.399482 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56e885b6-4a14-4f86-9706-4c4df78b2704" (UID: "56e885b6-4a14-4f86-9706-4c4df78b2704"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.400171 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-config-data" (OuterVolumeSpecName: "config-data") pod "56e885b6-4a14-4f86-9706-4c4df78b2704" (UID: "56e885b6-4a14-4f86-9706-4c4df78b2704"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.445586 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.445858 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clpsr\" (UniqueName: \"kubernetes.io/projected/56e885b6-4a14-4f86-9706-4c4df78b2704-kube-api-access-clpsr\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.445870 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e885b6-4a14-4f86-9706-4c4df78b2704-logs\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.445879 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e885b6-4a14-4f86-9706-4c4df78b2704-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.498236 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.648587 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22cwq\" (UniqueName: \"kubernetes.io/projected/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-kube-api-access-22cwq\") pod \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.648803 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-combined-ca-bundle\") pod \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.648883 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-logs\") pod \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.648930 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-config-data\") pod \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\" (UID: \"33f6d9fc-e75a-4cb6-a760-bcef63340ba0\") " Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.650064 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-logs" (OuterVolumeSpecName: "logs") pod "33f6d9fc-e75a-4cb6-a760-bcef63340ba0" (UID: "33f6d9fc-e75a-4cb6-a760-bcef63340ba0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.651714 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-kube-api-access-22cwq" (OuterVolumeSpecName: "kube-api-access-22cwq") pod "33f6d9fc-e75a-4cb6-a760-bcef63340ba0" (UID: "33f6d9fc-e75a-4cb6-a760-bcef63340ba0"). InnerVolumeSpecName "kube-api-access-22cwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.679328 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33f6d9fc-e75a-4cb6-a760-bcef63340ba0" (UID: "33f6d9fc-e75a-4cb6-a760-bcef63340ba0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.686453 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-config-data" (OuterVolumeSpecName: "config-data") pod "33f6d9fc-e75a-4cb6-a760-bcef63340ba0" (UID: "33f6d9fc-e75a-4cb6-a760-bcef63340ba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.725779 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56e885b6-4a14-4f86-9706-4c4df78b2704","Type":"ContainerDied","Data":"4f59cda9d8055dc2bccf6d37f1da2627f1e4dfe49decbe13c481ef036b353386"} Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.725845 4929 scope.go:117] "RemoveContainer" containerID="04d9f919abb6004df897eed30ca7db6741b0be4fa6e87201ab832dfa31f16088" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.726032 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.744554 4929 generic.go:334] "Generic (PLEG): container finished" podID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerID="5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0" exitCode=0 Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.744667 4929 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.745364 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33f6d9fc-e75a-4cb6-a760-bcef63340ba0","Type":"ContainerDied","Data":"5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0"}
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.745412 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33f6d9fc-e75a-4cb6-a760-bcef63340ba0","Type":"ContainerDied","Data":"7f4ae9a158290492b9867e814282ef3774c3829d94a116b84855ad6c36b7f97d"}
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.751150 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22cwq\" (UniqueName: \"kubernetes.io/projected/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-kube-api-access-22cwq\") on node \"crc\" DevicePath \"\""
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.751183 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.751193 4929 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-logs\") on node \"crc\" DevicePath \"\""
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.751202 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f6d9fc-e75a-4cb6-a760-bcef63340ba0-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.765632 4929 scope.go:117] "RemoveContainer" containerID="4e323859df312ddcfa507ba095441a7a07d40e00202515d1ce964df782bb0cac"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.769911 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.785469 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.800375 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.808935 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: E1002 13:41:26.809525 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-log"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.809549 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-log"
Oct 02 13:41:26 crc kubenswrapper[4929]: E1002 13:41:26.809570 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-api"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.809580 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-api"
Oct 02 13:41:26 crc kubenswrapper[4929]: E1002 13:41:26.809628 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-metadata"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.809637 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-metadata"
Oct 02 13:41:26 crc kubenswrapper[4929]: E1002 13:41:26.809657 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-log"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.809664 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-log"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.809918 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-log"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.809936 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-metadata"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.810058 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" containerName="nova-api-api"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.810079 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" containerName="nova-metadata-log"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.811756 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.813855 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.818450 4929 scope.go:117] "RemoveContainer" containerID="5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.832897 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.853393 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.871470 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.873856 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.876824 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.884179 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.892449 4929 scope.go:117] "RemoveContainer" containerID="8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.924214 4929 scope.go:117] "RemoveContainer" containerID="5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0"
Oct 02 13:41:26 crc kubenswrapper[4929]: E1002 13:41:26.924627 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0\": container with ID starting with 5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0 not found: ID does not exist" containerID="5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.924691 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0"} err="failed to get container status \"5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0\": rpc error: code = NotFound desc = could not find container \"5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0\": container with ID starting with 5dfe250c38a09c81bf62544a2c44139e0a6f887b7f11697c6d7cccfadb0359e0 not found: ID does not exist"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.924718 4929 scope.go:117] "RemoveContainer" containerID="8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044"
Oct 02 13:41:26 crc kubenswrapper[4929]: E1002 13:41:26.925075 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044\": container with ID starting with 8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044 not found: ID does not exist" containerID="8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.925114 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044"} err="failed to get container status \"8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044\": rpc error: code = NotFound desc = could not find container \"8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044\": container with ID starting with 8fff1dac32846e5d7e5e72779f457cf87c083faa8afa78af33c8658e4b705044 not found: ID does not exist"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.956586 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-config-data\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.956713 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74203203-c2a9-4137-be9b-cc65ad40f5bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.956749 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74203203-c2a9-4137-be9b-cc65ad40f5bf-config-data\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.956769 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkmvx\" (UniqueName: \"kubernetes.io/projected/74203203-c2a9-4137-be9b-cc65ad40f5bf-kube-api-access-mkmvx\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.956804 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7lzn\" (UniqueName: \"kubernetes.io/projected/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-kube-api-access-n7lzn\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.957059 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74203203-c2a9-4137-be9b-cc65ad40f5bf-logs\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.957124 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:26 crc kubenswrapper[4929]: I1002 13:41:26.957172 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-logs\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.058937 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74203203-c2a9-4137-be9b-cc65ad40f5bf-logs\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059059 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059110 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-logs\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059151 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-config-data\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059190 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74203203-c2a9-4137-be9b-cc65ad40f5bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059215 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74203203-c2a9-4137-be9b-cc65ad40f5bf-config-data\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059239 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkmvx\" (UniqueName: \"kubernetes.io/projected/74203203-c2a9-4137-be9b-cc65ad40f5bf-kube-api-access-mkmvx\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059269 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7lzn\" (UniqueName: \"kubernetes.io/projected/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-kube-api-access-n7lzn\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.059479 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74203203-c2a9-4137-be9b-cc65ad40f5bf-logs\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.060811 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-logs\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.066666 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74203203-c2a9-4137-be9b-cc65ad40f5bf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.067415 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-config-data\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.069114 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.077743 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74203203-c2a9-4137-be9b-cc65ad40f5bf-config-data\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
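The paired "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" records for 5dfe250c... and 8fff1dac... (and earlier for 4c17b4bc... and bfe369bd...) are a benign race: kubelet retries RemoveContainer and finds CRI-O has already pruned the container. Deletion is effectively idempotent, as this small Python sketch illustrates (not CRI code; IDs truncated):

    # Sketch of why the NotFound errors above are harmless: a retried delete
    # treats "already gone" as success.
    containers = {"5dfe250c38a0...": "exited"}   # the runtime's view

    def remove_container(cid: str) -> None:
        try:
            del containers[cid]
        except KeyError:
            # Equivalent of "rpc error: code = NotFound": nothing left to do.
            pass

    remove_container("5dfe250c38a0...")   # removes the container
    remove_container("5dfe250c38a0...")   # retry logs NotFound, still succeeds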
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.077743 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74203203-c2a9-4137-be9b-cc65ad40f5bf-config-data\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.080030 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkmvx\" (UniqueName: \"kubernetes.io/projected/74203203-c2a9-4137-be9b-cc65ad40f5bf-kube-api-access-mkmvx\") pod \"nova-metadata-0\" (UID: \"74203203-c2a9-4137-be9b-cc65ad40f5bf\") " pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.082305 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7lzn\" (UniqueName: \"kubernetes.io/projected/2dd72841-9f27-48f6-9e2c-1de4e8d50e72-kube-api-access-n7lzn\") pod \"nova-api-0\" (UID: \"2dd72841-9f27-48f6-9e2c-1de4e8d50e72\") " pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.138618 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.207671 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.648604 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 13:41:27 crc kubenswrapper[4929]: I1002 13:41:27.780690 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 13:41:27 crc kubenswrapper[4929]: W1002 13:41:27.992320 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dd72841_9f27_48f6_9e2c_1de4e8d50e72.slice/crio-1bc7ddca718fdfb645f5eb0343a9e051c778795e950536a6a857decca82c26d6 WatchSource:0}: Error finding container 1bc7ddca718fdfb645f5eb0343a9e051c778795e950536a6a857decca82c26d6: Status 404 returned error can't find the container with id 1bc7ddca718fdfb645f5eb0343a9e051c778795e950536a6a857decca82c26d6
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.212706 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f6d9fc-e75a-4cb6-a760-bcef63340ba0" path="/var/lib/kubelet/pods/33f6d9fc-e75a-4cb6-a760-bcef63340ba0/volumes"
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.214285 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e885b6-4a14-4f86-9706-4c4df78b2704" path="/var/lib/kubelet/pods/56e885b6-4a14-4f86-9706-4c4df78b2704/volumes"
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.774326 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dd72841-9f27-48f6-9e2c-1de4e8d50e72","Type":"ContainerStarted","Data":"d413203520ec8671cb6b5b271da79f171e7d9bd1256f13191725a70a38d95057"}
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.774591 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dd72841-9f27-48f6-9e2c-1de4e8d50e72","Type":"ContainerStarted","Data":"b2d5227a03246df8ad9d9279e01c6ee98e2379626b011af9c3499dae396079dc"}
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.774601 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dd72841-9f27-48f6-9e2c-1de4e8d50e72","Type":"ContainerStarted","Data":"1bc7ddca718fdfb645f5eb0343a9e051c778795e950536a6a857decca82c26d6"}
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.775929 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74203203-c2a9-4137-be9b-cc65ad40f5bf","Type":"ContainerStarted","Data":"1238f5ce05845c9ecb7c2cc1064dfc490fc6c104d99ecaf740d25157051033a5"}
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.775990 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74203203-c2a9-4137-be9b-cc65ad40f5bf","Type":"ContainerStarted","Data":"fd5f79796b1f401ef27a4efeff9577ef75e523f40b29d2311a3daed8a0d32c03"}
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.776005 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74203203-c2a9-4137-be9b-cc65ad40f5bf","Type":"ContainerStarted","Data":"ff6bc2c7a4aa120c753a7060837dd6acd263593d72b03817de7ab8f7eab8dce2"}
Oct 02 13:41:28 crc kubenswrapper[4929]: I1002 13:41:28.799157 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.799138803 podStartE2EDuration="2.799138803s" podCreationTimestamp="2025-10-02 13:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:41:28.7952482 +0000 UTC m=+9089.345614564" watchObservedRunningTime="2025-10-02 13:41:28.799138803 +0000 UTC m=+9089.349505167"
Oct 02 13:41:29 crc kubenswrapper[4929]: I1002 13:41:29.083555 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 02 13:41:29 crc kubenswrapper[4929]: I1002 13:41:29.092187 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 02 13:41:29 crc kubenswrapper[4929]: I1002 13:41:29.112113 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.112093685 podStartE2EDuration="3.112093685s" podCreationTimestamp="2025-10-02 13:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:41:28.822926102 +0000 UTC m=+9089.373292466" watchObservedRunningTime="2025-10-02 13:41:29.112093685 +0000 UTC m=+9089.662460049"
Oct 02 13:41:32 crc kubenswrapper[4929]: I1002 13:41:32.157542 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:41:32 crc kubenswrapper[4929]: E1002 13:41:32.159331 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:41:32 crc kubenswrapper[4929]: I1002 13:41:32.207905 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 13:41:32 crc kubenswrapper[4929]: I1002 13:41:32.207988 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 13:41:34 crc kubenswrapper[4929]: I1002 13:41:34.083373 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 02 13:41:34 crc kubenswrapper[4929]: I1002 13:41:34.123624 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 02 13:41:34 crc kubenswrapper[4929]: I1002 13:41:34.139313 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Oct 02 13:41:34 crc kubenswrapper[4929]: I1002 13:41:34.871224 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 02 13:41:37 crc kubenswrapper[4929]: I1002 13:41:37.139366 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 13:41:37 crc kubenswrapper[4929]: I1002 13:41:37.139656 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 13:41:37 crc kubenswrapper[4929]: I1002 13:41:37.208357 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 13:41:37 crc kubenswrapper[4929]: I1002 13:41:37.208392 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 13:41:38 crc kubenswrapper[4929]: I1002 13:41:38.222278 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dd72841-9f27-48f6-9e2c-1de4e8d50e72" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 13:41:38 crc kubenswrapper[4929]: I1002 13:41:38.222278 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dd72841-9f27-48f6-9e2c-1de4e8d50e72" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 13:41:38 crc kubenswrapper[4929]: I1002 13:41:38.305185 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74203203-c2a9-4137-be9b-cc65ad40f5bf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.195:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 13:41:38 crc kubenswrapper[4929]: I1002 13:41:38.305217 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74203203-c2a9-4137-be9b-cc65ad40f5bf" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.195:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 13:41:45 crc kubenswrapper[4929]: I1002 13:41:45.157503 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:41:45 crc kubenswrapper[4929]: E1002 13:41:45.158261 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
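The probe output above is the standard Go net/http error text: when an HTTP GET is cut off by the client's own timeout (probe timeoutSeconds, 1s by default), net/http reports "context deadline exceeded (Client.Timeout exceeded while awaiting headers)". A minimal sketch reproducing it; the URL is the nova-api endpoint from the log, and hitting it from outside the cluster is only illustrative:

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Same shape as kubelet's HTTP prober: a client with a hard timeout.
	client := &http.Client{Timeout: 1 * time.Second}

	resp, err := client.Get("http://10.217.1.194:8774/")
	if err != nil {
		// If the server accepts the connection but is slow to send headers:
		// Get "http://10.217.1.194:8774/": context deadline exceeded
		// (Client.Timeout exceeded while awaiting headers)
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}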
(probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.144568 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.144850 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.148753 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.210164 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.211072 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.212054 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.960209 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.962075 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 13:41:47 crc kubenswrapper[4929]: I1002 13:41:47.963371 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.971547 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"] Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.973289 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.976376 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.976691 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.976885 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.977051 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.977715 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.977825 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dxc54" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.977985 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980349 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980404 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980427 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980450 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980485 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ceph\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980510 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v9s9\" (UniqueName: \"kubernetes.io/projected/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-kube-api-access-7v9s9\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980536 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980581 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980645 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980666 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:48 crc kubenswrapper[4929]: I1002 13:41:48.980693 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.005702 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"] Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082184 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082246 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v9s9\" (UniqueName: \"kubernetes.io/projected/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-kube-api-access-7v9s9\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082285 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082340 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082377 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082394 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082425 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082477 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" 
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082524 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082542 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.082570 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.083452 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.083478 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.088737 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.088935 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.089384 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.090395 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.091387 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.092164 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.092701 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.098541 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.105416 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v9s9\" (UniqueName: \"kubernetes.io/projected/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-kube-api-access-7v9s9\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.311153 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.869588 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98"]
Oct 02 13:41:49 crc kubenswrapper[4929]: W1002 13:41:49.872618 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e41a2c6_d1f1_45c5_92a6_f7cf301ad241.slice/crio-0dad4b28e7dde1d306aeb5326f2eccce9e2462f1691eddb85358c37b35ebe9df WatchSource:0}: Error finding container 0dad4b28e7dde1d306aeb5326f2eccce9e2462f1691eddb85358c37b35ebe9df: Status 404 returned error can't find the container with id 0dad4b28e7dde1d306aeb5326f2eccce9e2462f1691eddb85358c37b35ebe9df
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.875366 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 13:41:49 crc kubenswrapper[4929]: I1002 13:41:49.983788 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" event={"ID":"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241","Type":"ContainerStarted","Data":"0dad4b28e7dde1d306aeb5326f2eccce9e2462f1691eddb85358c37b35ebe9df"}
Oct 02 13:41:51 crc kubenswrapper[4929]: I1002 13:41:50.999613 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" event={"ID":"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241","Type":"ContainerStarted","Data":"d7b1db0064c088b3382169bc6ebcc36c80a722a99d1a2da1d3f4e7ab07386b71"}
Oct 02 13:41:51 crc kubenswrapper[4929]: I1002 13:41:51.028943 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" podStartSLOduration=2.53294391 podStartE2EDuration="3.028925518s" podCreationTimestamp="2025-10-02 13:41:48 +0000 UTC" firstStartedPulling="2025-10-02 13:41:49.875108711 +0000 UTC m=+9110.425475075" lastFinishedPulling="2025-10-02 13:41:50.371090319 +0000 UTC m=+9110.921456683" observedRunningTime="2025-10-02 13:41:51.019524636 +0000 UTC m=+9111.569891030" watchObservedRunningTime="2025-10-02 13:41:51.028925518 +0000 UTC m=+9111.579291882"
Oct 02 13:41:58 crc kubenswrapper[4929]: I1002 13:41:58.156450 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:41:58 crc kubenswrapper[4929]: E1002 13:41:58.157318 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:42:13 crc kubenswrapper[4929]: I1002 13:42:13.158194 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:42:13 crc kubenswrapper[4929]: E1002 13:42:13.159399 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:42:27 crc kubenswrapper[4929]: I1002 13:42:27.156443 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:42:27 crc kubenswrapper[4929]: E1002 13:42:27.157213 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:42:42 crc kubenswrapper[4929]: I1002 13:42:42.157100 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:42:42 crc kubenswrapper[4929]: E1002 13:42:42.158926 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 13:42:53 crc kubenswrapper[4929]: I1002 13:42:53.157711 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73"
Oct 02 13:42:53 crc kubenswrapper[4929]: I1002 13:42:53.622868 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"d63c3fd8bc48c58dbeed7c4a6c013d4dbe6685c805ca112a89abba0b4689b1ca"}
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.092050 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x95dc"]
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.096207 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.100736 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x95dc"]
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.184466 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"]
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.186467 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
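The repeating "CrashLoopBackOff: back-off 5m0s" entries above reflect kubelet's restart back-off, which per the Kubernetes documentation doubles from 10s up to a 5-minute cap and resets after 10 minutes of successful running. A sketch of that schedule (the exact constants are the documented defaults, assumed unchanged here):

package main

import (
	"fmt"
	"time"
)

func main() {
	const maxBackOff = 5 * time.Minute // the "back-off 5m0s" cap seen above
	delay := 10 * time.Second          // initial delay, doubled per failed restart

	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("restart %d: wait %v\n", restart, delay)
		delay *= 2
		if delay > maxBackOff {
			// machine-config-daemon-8j488 above is pinned at this cap,
			// hence the retries roughly every few minutes.
			delay = maxBackOff
		}
	}
}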
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.191900 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.192306 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.220559 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"]
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.267439 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr9ft\" (UniqueName: \"kubernetes.io/projected/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-kube-api-access-tr9ft\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.268426 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-utilities\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.268480 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-catalog-content\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.370535 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-secret-volume\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.370614 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-utilities\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.370652 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-catalog-content\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.370737 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr9ft\" (UniqueName: \"kubernetes.io/projected/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-kube-api-access-tr9ft\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.370792 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg9x4\" (UniqueName: \"kubernetes.io/projected/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-kube-api-access-fg9x4\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.370841 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-config-volume\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.373230 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-utilities\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.373665 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-catalog-content\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.403856 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr9ft\" (UniqueName: \"kubernetes.io/projected/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-kube-api-access-tr9ft\") pod \"redhat-operators-x95dc\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.423512 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.473216 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-secret-volume\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.473429 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg9x4\" (UniqueName: \"kubernetes.io/projected/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-kube-api-access-fg9x4\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.473488 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-config-volume\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.474603 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-config-volume\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.477097 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-secret-volume\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.493877 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg9x4\" (UniqueName: \"kubernetes.io/projected/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-kube-api-access-fg9x4\") pod \"collect-profiles-29323545-xlgv2\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.517736 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:00 crc kubenswrapper[4929]: I1002 13:45:00.973407 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x95dc"]
Oct 02 13:45:01 crc kubenswrapper[4929]: I1002 13:45:01.099728 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"]
Oct 02 13:45:01 crc kubenswrapper[4929]: I1002 13:45:01.102276 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x95dc" event={"ID":"7121cbd7-7855-4bc5-86de-4cfbbbdeda56","Type":"ContainerStarted","Data":"ec5464b1d73c8b9481dcfe0dbbd1efe4d20a47f6f6c5f579c6e774163d8a374b"}
Oct 02 13:45:02 crc kubenswrapper[4929]: I1002 13:45:02.112255 4929 generic.go:334] "Generic (PLEG): container finished" podID="bbb4e74f-d5e4-4e44-91fd-88fb3f66d574" containerID="3f812e957a6f5d0eb1a5e9750bebb2a4a8ea23650546115d681a42d1917a3ebb" exitCode=0
Oct 02 13:45:02 crc kubenswrapper[4929]: I1002 13:45:02.112472 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2" event={"ID":"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574","Type":"ContainerDied","Data":"3f812e957a6f5d0eb1a5e9750bebb2a4a8ea23650546115d681a42d1917a3ebb"}
Oct 02 13:45:02 crc kubenswrapper[4929]: I1002 13:45:02.112555 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2" event={"ID":"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574","Type":"ContainerStarted","Data":"4de5392d24f7c3d5b353c79fd549d1cab1da613456ae870e4a6c0ce9a23a7a21"}
Oct 02 13:45:02 crc kubenswrapper[4929]: I1002 13:45:02.113913 4929 generic.go:334] "Generic (PLEG): container finished" podID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerID="b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b" exitCode=0
Oct 02 13:45:02 crc kubenswrapper[4929]: I1002 13:45:02.113953 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x95dc" event={"ID":"7121cbd7-7855-4bc5-86de-4cfbbbdeda56","Type":"ContainerDied","Data":"b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b"}
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.538751 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.656276 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-secret-volume\") pod \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") "
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.656416 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-config-volume\") pod \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") "
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.656495 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg9x4\" (UniqueName: \"kubernetes.io/projected/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-kube-api-access-fg9x4\") pod \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\" (UID: \"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574\") "
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.657287 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-config-volume" (OuterVolumeSpecName: "config-volume") pod "bbb4e74f-d5e4-4e44-91fd-88fb3f66d574" (UID: "bbb4e74f-d5e4-4e44-91fd-88fb3f66d574"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.662306 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bbb4e74f-d5e4-4e44-91fd-88fb3f66d574" (UID: "bbb4e74f-d5e4-4e44-91fd-88fb3f66d574"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.662891 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-kube-api-access-fg9x4" (OuterVolumeSpecName: "kube-api-access-fg9x4") pod "bbb4e74f-d5e4-4e44-91fd-88fb3f66d574" (UID: "bbb4e74f-d5e4-4e44-91fd-88fb3f66d574"). InnerVolumeSpecName "kube-api-access-fg9x4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.759745 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.759777 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-config-volume\") on node \"crc\" DevicePath \"\""
Oct 02 13:45:03 crc kubenswrapper[4929]: I1002 13:45:03.759787 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg9x4\" (UniqueName: \"kubernetes.io/projected/bbb4e74f-d5e4-4e44-91fd-88fb3f66d574-kube-api-access-fg9x4\") on node \"crc\" DevicePath \"\""
Oct 02 13:45:04 crc kubenswrapper[4929]: I1002 13:45:04.139557 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x95dc" event={"ID":"7121cbd7-7855-4bc5-86de-4cfbbbdeda56","Type":"ContainerStarted","Data":"cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4"}
Oct 02 13:45:04 crc kubenswrapper[4929]: I1002 13:45:04.142693 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2" event={"ID":"bbb4e74f-d5e4-4e44-91fd-88fb3f66d574","Type":"ContainerDied","Data":"4de5392d24f7c3d5b353c79fd549d1cab1da613456ae870e4a6c0ce9a23a7a21"}
Oct 02 13:45:04 crc kubenswrapper[4929]: I1002 13:45:04.142721 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de5392d24f7c3d5b353c79fd549d1cab1da613456ae870e4a6c0ce9a23a7a21"
Oct 02 13:45:04 crc kubenswrapper[4929]: I1002 13:45:04.142787 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323545-xlgv2"
Oct 02 13:45:04 crc kubenswrapper[4929]: I1002 13:45:04.612764 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t"]
Oct 02 13:45:04 crc kubenswrapper[4929]: I1002 13:45:04.622892 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-2944t"]
Oct 02 13:45:05 crc kubenswrapper[4929]: I1002 13:45:05.154381 4929 generic.go:334] "Generic (PLEG): container finished" podID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerID="cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4" exitCode=0
Oct 02 13:45:05 crc kubenswrapper[4929]: I1002 13:45:05.154469 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x95dc" event={"ID":"7121cbd7-7855-4bc5-86de-4cfbbbdeda56","Type":"ContainerDied","Data":"cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4"}
Oct 02 13:45:06 crc kubenswrapper[4929]: I1002 13:45:06.183717 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a98dc16-e89b-4248-96b9-82977c83f774" path="/var/lib/kubelet/pods/3a98dc16-e89b-4248-96b9-82977c83f774/volumes"
Oct 02 13:45:06 crc kubenswrapper[4929]: I1002 13:45:06.186994 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x95dc" event={"ID":"7121cbd7-7855-4bc5-86de-4cfbbbdeda56","Type":"ContainerStarted","Data":"5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345"}
Oct 02 13:45:06 crc kubenswrapper[4929]: I1002 13:45:06.227263 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x95dc" podStartSLOduration=2.693707185 podStartE2EDuration="6.227242995s" podCreationTimestamp="2025-10-02 13:45:00 +0000 UTC" firstStartedPulling="2025-10-02 13:45:02.116505473 +0000 UTC m=+9302.666871837" lastFinishedPulling="2025-10-02 13:45:05.650041283 +0000 UTC m=+9306.200407647" observedRunningTime="2025-10-02 13:45:06.216799242 +0000 UTC m=+9306.767165596" watchObservedRunningTime="2025-10-02 13:45:06.227242995 +0000 UTC m=+9306.777609359"
Oct 02 13:45:10 crc kubenswrapper[4929]: I1002 13:45:10.424647 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:10 crc kubenswrapper[4929]: I1002 13:45:10.425759 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x95dc"
Oct 02 13:45:11 crc kubenswrapper[4929]: I1002 13:45:11.482115 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x95dc" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="registry-server" probeResult="failure" output=<
Oct 02 13:45:11 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s
Oct 02 13:45:11 crc kubenswrapper[4929]: >
Oct 02 13:45:14 crc kubenswrapper[4929]: I1002 13:45:14.736396 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 13:45:14 crc kubenswrapper[4929]: I1002 13:45:14.737381 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
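The registry-server probe output ("timeout: failed to connect service \":50051\" within 1s") is the wording grpc-health-probe prints, so this catalog pod's startup probe appears to exec a gRPC health check against port 50051. A standalone Go equivalent of that check, offered as an assumption about the probe rather than its actual command line:

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	// Mirror the probe's 1s budget.
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	conn, err := grpc.NewClient("localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	// Standard gRPC health protocol; empty Service means "overall server".
	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil {
		fmt.Println("check failed:", err) // analogous to the timeout logged above
		return
	}
	fmt.Println("status:", resp.GetStatus()) // SERVING once the catalog index is loaded
}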
pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:45:20 crc kubenswrapper[4929]: I1002 13:45:20.485412 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x95dc" Oct 02 13:45:20 crc kubenswrapper[4929]: I1002 13:45:20.552433 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x95dc" Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.347532 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x95dc"] Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.348895 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x95dc" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="registry-server" containerID="cri-o://5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345" gracePeriod=2 Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.841862 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x95dc" Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.963359 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-utilities\") pod \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.963427 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-catalog-content\") pod \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.963683 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr9ft\" (UniqueName: \"kubernetes.io/projected/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-kube-api-access-tr9ft\") pod \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\" (UID: \"7121cbd7-7855-4bc5-86de-4cfbbbdeda56\") " Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.964249 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-utilities" (OuterVolumeSpecName: "utilities") pod "7121cbd7-7855-4bc5-86de-4cfbbbdeda56" (UID: "7121cbd7-7855-4bc5-86de-4cfbbbdeda56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.965105 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:45:23 crc kubenswrapper[4929]: I1002 13:45:23.974791 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-kube-api-access-tr9ft" (OuterVolumeSpecName: "kube-api-access-tr9ft") pod "7121cbd7-7855-4bc5-86de-4cfbbbdeda56" (UID: "7121cbd7-7855-4bc5-86de-4cfbbbdeda56"). InnerVolumeSpecName "kube-api-access-tr9ft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.046279 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7121cbd7-7855-4bc5-86de-4cfbbbdeda56" (UID: "7121cbd7-7855-4bc5-86de-4cfbbbdeda56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.067505 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr9ft\" (UniqueName: \"kubernetes.io/projected/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-kube-api-access-tr9ft\") on node \"crc\" DevicePath \"\"" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.067542 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7121cbd7-7855-4bc5-86de-4cfbbbdeda56-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.419239 4929 generic.go:334] "Generic (PLEG): container finished" podID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerID="5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345" exitCode=0 Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.419290 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x95dc" event={"ID":"7121cbd7-7855-4bc5-86de-4cfbbbdeda56","Type":"ContainerDied","Data":"5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345"} Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.419343 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x95dc" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.419360 4929 scope.go:117] "RemoveContainer" containerID="5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.419349 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x95dc" event={"ID":"7121cbd7-7855-4bc5-86de-4cfbbbdeda56","Type":"ContainerDied","Data":"ec5464b1d73c8b9481dcfe0dbbd1efe4d20a47f6f6c5f579c6e774163d8a374b"} Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.452434 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x95dc"] Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.463340 4929 scope.go:117] "RemoveContainer" containerID="cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.465720 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x95dc"] Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.488359 4929 scope.go:117] "RemoveContainer" containerID="b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.538106 4929 scope.go:117] "RemoveContainer" containerID="5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345" Oct 02 13:45:24 crc kubenswrapper[4929]: E1002 13:45:24.538759 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345\": container with ID starting with 5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345 
not found: ID does not exist" containerID="5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.538788 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345"} err="failed to get container status \"5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345\": rpc error: code = NotFound desc = could not find container \"5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345\": container with ID starting with 5df06c828060d17a580d385345bc49c4dd646aac33060974a1a979f6b255e345 not found: ID does not exist" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.538809 4929 scope.go:117] "RemoveContainer" containerID="cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4" Oct 02 13:45:24 crc kubenswrapper[4929]: E1002 13:45:24.539241 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4\": container with ID starting with cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4 not found: ID does not exist" containerID="cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.539290 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4"} err="failed to get container status \"cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4\": rpc error: code = NotFound desc = could not find container \"cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4\": container with ID starting with cbc2601e02926c916e08881dc5bef5a92dd0e8aa1581ce6f066fd50ba810b0e4 not found: ID does not exist" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.539321 4929 scope.go:117] "RemoveContainer" containerID="b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b" Oct 02 13:45:24 crc kubenswrapper[4929]: E1002 13:45:24.539703 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b\": container with ID starting with b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b not found: ID does not exist" containerID="b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b" Oct 02 13:45:24 crc kubenswrapper[4929]: I1002 13:45:24.539764 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b"} err="failed to get container status \"b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b\": rpc error: code = NotFound desc = could not find container \"b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b\": container with ID starting with b5aafcfaf8c53dab09ade2cd3d831fb40d437df15473544636f84f1cfb53780b not found: ID does not exist" Oct 02 13:45:26 crc kubenswrapper[4929]: I1002 13:45:26.182251 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" path="/var/lib/kubelet/pods/7121cbd7-7855-4bc5-86de-4cfbbbdeda56/volumes" Oct 02 13:45:29 crc kubenswrapper[4929]: I1002 13:45:29.308196 4929 scope.go:117] "RemoveContainer" 
containerID="e46c693039690c00c623331a534fdbe193db84623fa2e1c5bf9c099e43a7947b" Oct 02 13:45:44 crc kubenswrapper[4929]: I1002 13:45:44.741857 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:45:44 crc kubenswrapper[4929]: I1002 13:45:44.742839 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:46:14 crc kubenswrapper[4929]: I1002 13:46:14.736585 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:46:14 crc kubenswrapper[4929]: I1002 13:46:14.738307 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:46:14 crc kubenswrapper[4929]: I1002 13:46:14.738473 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:46:14 crc kubenswrapper[4929]: I1002 13:46:14.739413 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d63c3fd8bc48c58dbeed7c4a6c013d4dbe6685c805ca112a89abba0b4689b1ca"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:46:14 crc kubenswrapper[4929]: I1002 13:46:14.739553 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://d63c3fd8bc48c58dbeed7c4a6c013d4dbe6685c805ca112a89abba0b4689b1ca" gracePeriod=600 Oct 02 13:46:15 crc kubenswrapper[4929]: I1002 13:46:15.102000 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="d63c3fd8bc48c58dbeed7c4a6c013d4dbe6685c805ca112a89abba0b4689b1ca" exitCode=0 Oct 02 13:46:15 crc kubenswrapper[4929]: I1002 13:46:15.102350 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"d63c3fd8bc48c58dbeed7c4a6c013d4dbe6685c805ca112a89abba0b4689b1ca"} Oct 02 13:46:15 crc kubenswrapper[4929]: I1002 13:46:15.102511 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47"} Oct 02 13:46:15 
crc kubenswrapper[4929]: I1002 13:46:15.102550 4929 scope.go:117] "RemoveContainer" containerID="168a5dbf0ef98bee5fb66cbb7f791bf21bfed43fe58c4a4f69b35739e78c4d73" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.084637 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-trssl"] Oct 02 13:47:23 crc kubenswrapper[4929]: E1002 13:47:23.085970 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="extract-utilities" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.085985 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="extract-utilities" Oct 02 13:47:23 crc kubenswrapper[4929]: E1002 13:47:23.086013 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb4e74f-d5e4-4e44-91fd-88fb3f66d574" containerName="collect-profiles" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.086019 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb4e74f-d5e4-4e44-91fd-88fb3f66d574" containerName="collect-profiles" Oct 02 13:47:23 crc kubenswrapper[4929]: E1002 13:47:23.086038 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="registry-server" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.086045 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="registry-server" Oct 02 13:47:23 crc kubenswrapper[4929]: E1002 13:47:23.086051 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="extract-content" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.086057 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="extract-content" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.086297 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="7121cbd7-7855-4bc5-86de-4cfbbbdeda56" containerName="registry-server" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.086314 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb4e74f-d5e4-4e44-91fd-88fb3f66d574" containerName="collect-profiles" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.090594 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.099270 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-trssl"] Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.151866 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-utilities\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.152408 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-catalog-content\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.152508 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqtd\" (UniqueName: \"kubernetes.io/projected/d497abcf-df6a-4a62-86f8-e897a7683f7e-kube-api-access-zsqtd\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.255280 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-catalog-content\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.255367 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqtd\" (UniqueName: \"kubernetes.io/projected/d497abcf-df6a-4a62-86f8-e897a7683f7e-kube-api-access-zsqtd\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.255404 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-utilities\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.256063 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-catalog-content\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.256134 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-utilities\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.278187 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zsqtd\" (UniqueName: \"kubernetes.io/projected/d497abcf-df6a-4a62-86f8-e897a7683f7e-kube-api-access-zsqtd\") pod \"community-operators-trssl\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:23 crc kubenswrapper[4929]: I1002 13:47:23.416308 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:24 crc kubenswrapper[4929]: I1002 13:47:24.503852 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-trssl"] Oct 02 13:47:24 crc kubenswrapper[4929]: I1002 13:47:24.938221 4929 generic.go:334] "Generic (PLEG): container finished" podID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerID="14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97" exitCode=0 Oct 02 13:47:24 crc kubenswrapper[4929]: I1002 13:47:24.938275 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-trssl" event={"ID":"d497abcf-df6a-4a62-86f8-e897a7683f7e","Type":"ContainerDied","Data":"14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97"} Oct 02 13:47:24 crc kubenswrapper[4929]: I1002 13:47:24.938599 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-trssl" event={"ID":"d497abcf-df6a-4a62-86f8-e897a7683f7e","Type":"ContainerStarted","Data":"c73834b03eaf5b6f58652f7a34b10b623aaef650e038b758bdaf7efad5864ebd"} Oct 02 13:47:24 crc kubenswrapper[4929]: I1002 13:47:24.942076 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:47:26 crc kubenswrapper[4929]: I1002 13:47:26.964120 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-trssl" event={"ID":"d497abcf-df6a-4a62-86f8-e897a7683f7e","Type":"ContainerStarted","Data":"692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702"} Oct 02 13:47:27 crc kubenswrapper[4929]: I1002 13:47:27.978919 4929 generic.go:334] "Generic (PLEG): container finished" podID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerID="692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702" exitCode=0 Oct 02 13:47:27 crc kubenswrapper[4929]: I1002 13:47:27.979028 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-trssl" event={"ID":"d497abcf-df6a-4a62-86f8-e897a7683f7e","Type":"ContainerDied","Data":"692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702"} Oct 02 13:47:28 crc kubenswrapper[4929]: I1002 13:47:28.992851 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-trssl" event={"ID":"d497abcf-df6a-4a62-86f8-e897a7683f7e","Type":"ContainerStarted","Data":"01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135"} Oct 02 13:47:29 crc kubenswrapper[4929]: I1002 13:47:29.022415 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-trssl" podStartSLOduration=2.457360268 podStartE2EDuration="6.022396339s" podCreationTimestamp="2025-10-02 13:47:23 +0000 UTC" firstStartedPulling="2025-10-02 13:47:24.941729882 +0000 UTC m=+9445.492096246" lastFinishedPulling="2025-10-02 13:47:28.506765953 +0000 UTC m=+9449.057132317" observedRunningTime="2025-10-02 13:47:29.014256934 +0000 UTC m=+9449.564623298" watchObservedRunningTime="2025-10-02 
13:47:29.022396339 +0000 UTC m=+9449.572762703" Oct 02 13:47:33 crc kubenswrapper[4929]: I1002 13:47:33.417008 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:33 crc kubenswrapper[4929]: I1002 13:47:33.417772 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:33 crc kubenswrapper[4929]: I1002 13:47:33.491065 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:34 crc kubenswrapper[4929]: I1002 13:47:34.106912 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:34 crc kubenswrapper[4929]: I1002 13:47:34.175249 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-trssl"] Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.069856 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-trssl" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="registry-server" containerID="cri-o://01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135" gracePeriod=2 Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.607492 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.797169 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-catalog-content\") pod \"d497abcf-df6a-4a62-86f8-e897a7683f7e\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.797525 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsqtd\" (UniqueName: \"kubernetes.io/projected/d497abcf-df6a-4a62-86f8-e897a7683f7e-kube-api-access-zsqtd\") pod \"d497abcf-df6a-4a62-86f8-e897a7683f7e\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.797561 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-utilities\") pod \"d497abcf-df6a-4a62-86f8-e897a7683f7e\" (UID: \"d497abcf-df6a-4a62-86f8-e897a7683f7e\") " Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.798455 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-utilities" (OuterVolumeSpecName: "utilities") pod "d497abcf-df6a-4a62-86f8-e897a7683f7e" (UID: "d497abcf-df6a-4a62-86f8-e897a7683f7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.803563 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d497abcf-df6a-4a62-86f8-e897a7683f7e-kube-api-access-zsqtd" (OuterVolumeSpecName: "kube-api-access-zsqtd") pod "d497abcf-df6a-4a62-86f8-e897a7683f7e" (UID: "d497abcf-df6a-4a62-86f8-e897a7683f7e"). InnerVolumeSpecName "kube-api-access-zsqtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.842867 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d497abcf-df6a-4a62-86f8-e897a7683f7e" (UID: "d497abcf-df6a-4a62-86f8-e897a7683f7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.900613 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsqtd\" (UniqueName: \"kubernetes.io/projected/d497abcf-df6a-4a62-86f8-e897a7683f7e-kube-api-access-zsqtd\") on node \"crc\" DevicePath \"\"" Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.900662 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:47:36 crc kubenswrapper[4929]: I1002 13:47:36.900674 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d497abcf-df6a-4a62-86f8-e897a7683f7e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.082796 4929 generic.go:334] "Generic (PLEG): container finished" podID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerID="01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135" exitCode=0 Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.082857 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-trssl" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.082877 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-trssl" event={"ID":"d497abcf-df6a-4a62-86f8-e897a7683f7e","Type":"ContainerDied","Data":"01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135"} Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.082914 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-trssl" event={"ID":"d497abcf-df6a-4a62-86f8-e897a7683f7e","Type":"ContainerDied","Data":"c73834b03eaf5b6f58652f7a34b10b623aaef650e038b758bdaf7efad5864ebd"} Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.082938 4929 scope.go:117] "RemoveContainer" containerID="01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.106819 4929 scope.go:117] "RemoveContainer" containerID="692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.125055 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-trssl"] Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.136330 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-trssl"] Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.141834 4929 scope.go:117] "RemoveContainer" containerID="14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.207747 4929 scope.go:117] "RemoveContainer" containerID="01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135" Oct 02 13:47:37 crc kubenswrapper[4929]: E1002 13:47:37.208356 4929 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135\": container with ID starting with 01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135 not found: ID does not exist" containerID="01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.208454 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135"} err="failed to get container status \"01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135\": rpc error: code = NotFound desc = could not find container \"01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135\": container with ID starting with 01fc0384aca911b6bbc5d6f1add54675da27cedc7d95f46dc003323f820c3135 not found: ID does not exist" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.208513 4929 scope.go:117] "RemoveContainer" containerID="692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702" Oct 02 13:47:37 crc kubenswrapper[4929]: E1002 13:47:37.208844 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702\": container with ID starting with 692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702 not found: ID does not exist" containerID="692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.208874 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702"} err="failed to get container status \"692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702\": rpc error: code = NotFound desc = could not find container \"692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702\": container with ID starting with 692cf25aa9bfb5fb2cdcfdf3aee279b4049e55f4c333d4ed9ffd894d6df37702 not found: ID does not exist" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.208895 4929 scope.go:117] "RemoveContainer" containerID="14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97" Oct 02 13:47:37 crc kubenswrapper[4929]: E1002 13:47:37.209239 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97\": container with ID starting with 14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97 not found: ID does not exist" containerID="14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97" Oct 02 13:47:37 crc kubenswrapper[4929]: I1002 13:47:37.209275 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97"} err="failed to get container status \"14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97\": rpc error: code = NotFound desc = could not find container \"14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97\": container with ID starting with 14ca3b1e046cd93c763e647c58f4d4ac228cc8c9100d8c622fb7ff930c1cde97 not found: ID does not exist" Oct 02 13:47:38 crc kubenswrapper[4929]: I1002 13:47:38.176906 4929 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" path="/var/lib/kubelet/pods/d497abcf-df6a-4a62-86f8-e897a7683f7e/volumes" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.675000 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9tsml"] Oct 02 13:47:55 crc kubenswrapper[4929]: E1002 13:47:55.677718 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="extract-utilities" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.677852 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="extract-utilities" Oct 02 13:47:55 crc kubenswrapper[4929]: E1002 13:47:55.677978 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="registry-server" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.678045 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="registry-server" Oct 02 13:47:55 crc kubenswrapper[4929]: E1002 13:47:55.678160 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="extract-content" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.678220 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="extract-content" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.678515 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d497abcf-df6a-4a62-86f8-e897a7683f7e" containerName="registry-server" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.680368 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.688779 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tsml"] Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.790986 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-utilities\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.791464 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986ks\" (UniqueName: \"kubernetes.io/projected/6b95034d-1dba-42d4-b143-3d5758f9fc76-kube-api-access-986ks\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.791649 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-catalog-content\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.893826 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-utilities\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.893879 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986ks\" (UniqueName: \"kubernetes.io/projected/6b95034d-1dba-42d4-b143-3d5758f9fc76-kube-api-access-986ks\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.893931 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-catalog-content\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.894403 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-catalog-content\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.894778 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-utilities\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:55 crc kubenswrapper[4929]: I1002 13:47:55.914318 4929 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-986ks\" (UniqueName: \"kubernetes.io/projected/6b95034d-1dba-42d4-b143-3d5758f9fc76-kube-api-access-986ks\") pod \"redhat-marketplace-9tsml\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:56 crc kubenswrapper[4929]: I1002 13:47:56.041639 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:47:56 crc kubenswrapper[4929]: I1002 13:47:56.572615 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tsml"] Oct 02 13:47:57 crc kubenswrapper[4929]: I1002 13:47:57.371649 4929 generic.go:334] "Generic (PLEG): container finished" podID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerID="d57f814758bfdbad6a013e01041ed82ba5a93ce338044cb59bc2ad66e27ca757" exitCode=0 Oct 02 13:47:57 crc kubenswrapper[4929]: I1002 13:47:57.372549 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tsml" event={"ID":"6b95034d-1dba-42d4-b143-3d5758f9fc76","Type":"ContainerDied","Data":"d57f814758bfdbad6a013e01041ed82ba5a93ce338044cb59bc2ad66e27ca757"} Oct 02 13:47:57 crc kubenswrapper[4929]: I1002 13:47:57.372587 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tsml" event={"ID":"6b95034d-1dba-42d4-b143-3d5758f9fc76","Type":"ContainerStarted","Data":"41affe479606782c01706ed85ee3a8698c6b3395d9d785cc31d4e4001f291e8f"} Oct 02 13:47:58 crc kubenswrapper[4929]: I1002 13:47:58.388760 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tsml" event={"ID":"6b95034d-1dba-42d4-b143-3d5758f9fc76","Type":"ContainerStarted","Data":"b53ed666815f79c571a9e5efd298bec64edd3e08d000f1e96ff8312c73dee248"} Oct 02 13:47:59 crc kubenswrapper[4929]: I1002 13:47:59.399375 4929 generic.go:334] "Generic (PLEG): container finished" podID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerID="b53ed666815f79c571a9e5efd298bec64edd3e08d000f1e96ff8312c73dee248" exitCode=0 Oct 02 13:47:59 crc kubenswrapper[4929]: I1002 13:47:59.399436 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tsml" event={"ID":"6b95034d-1dba-42d4-b143-3d5758f9fc76","Type":"ContainerDied","Data":"b53ed666815f79c571a9e5efd298bec64edd3e08d000f1e96ff8312c73dee248"} Oct 02 13:48:00 crc kubenswrapper[4929]: I1002 13:48:00.410511 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tsml" event={"ID":"6b95034d-1dba-42d4-b143-3d5758f9fc76","Type":"ContainerStarted","Data":"0062d336aa448ad8ba06b8ec7fcd12dd6588f38d19c95e5dddea956b0f4e8769"} Oct 02 13:48:00 crc kubenswrapper[4929]: I1002 13:48:00.440438 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9tsml" podStartSLOduration=2.733027448 podStartE2EDuration="5.440415137s" podCreationTimestamp="2025-10-02 13:47:55 +0000 UTC" firstStartedPulling="2025-10-02 13:47:57.375139864 +0000 UTC m=+9477.925506228" lastFinishedPulling="2025-10-02 13:48:00.082527553 +0000 UTC m=+9480.632893917" observedRunningTime="2025-10-02 13:48:00.427605627 +0000 UTC m=+9480.977972011" watchObservedRunningTime="2025-10-02 13:48:00.440415137 +0000 UTC m=+9480.990781501" Oct 02 13:48:06 crc kubenswrapper[4929]: I1002 13:48:06.042724 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:48:06 crc kubenswrapper[4929]: I1002 13:48:06.043390 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:48:06 crc kubenswrapper[4929]: I1002 13:48:06.122609 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:48:06 crc kubenswrapper[4929]: I1002 13:48:06.528144 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:48:06 crc kubenswrapper[4929]: I1002 13:48:06.575560 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tsml"] Oct 02 13:48:08 crc kubenswrapper[4929]: I1002 13:48:08.510916 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9tsml" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="registry-server" containerID="cri-o://0062d336aa448ad8ba06b8ec7fcd12dd6588f38d19c95e5dddea956b0f4e8769" gracePeriod=2 Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.530858 4929 generic.go:334] "Generic (PLEG): container finished" podID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerID="0062d336aa448ad8ba06b8ec7fcd12dd6588f38d19c95e5dddea956b0f4e8769" exitCode=0 Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.531076 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tsml" event={"ID":"6b95034d-1dba-42d4-b143-3d5758f9fc76","Type":"ContainerDied","Data":"0062d336aa448ad8ba06b8ec7fcd12dd6588f38d19c95e5dddea956b0f4e8769"} Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.655457 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.749497 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-catalog-content\") pod \"6b95034d-1dba-42d4-b143-3d5758f9fc76\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.749570 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-utilities\") pod \"6b95034d-1dba-42d4-b143-3d5758f9fc76\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.749819 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-986ks\" (UniqueName: \"kubernetes.io/projected/6b95034d-1dba-42d4-b143-3d5758f9fc76-kube-api-access-986ks\") pod \"6b95034d-1dba-42d4-b143-3d5758f9fc76\" (UID: \"6b95034d-1dba-42d4-b143-3d5758f9fc76\") " Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.750889 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-utilities" (OuterVolumeSpecName: "utilities") pod "6b95034d-1dba-42d4-b143-3d5758f9fc76" (UID: "6b95034d-1dba-42d4-b143-3d5758f9fc76"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.756586 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b95034d-1dba-42d4-b143-3d5758f9fc76-kube-api-access-986ks" (OuterVolumeSpecName: "kube-api-access-986ks") pod "6b95034d-1dba-42d4-b143-3d5758f9fc76" (UID: "6b95034d-1dba-42d4-b143-3d5758f9fc76"). InnerVolumeSpecName "kube-api-access-986ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.762459 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b95034d-1dba-42d4-b143-3d5758f9fc76" (UID: "6b95034d-1dba-42d4-b143-3d5758f9fc76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.852747 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-986ks\" (UniqueName: \"kubernetes.io/projected/6b95034d-1dba-42d4-b143-3d5758f9fc76-kube-api-access-986ks\") on node \"crc\" DevicePath \"\"" Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.852780 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:48:09 crc kubenswrapper[4929]: I1002 13:48:09.852790 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b95034d-1dba-42d4-b143-3d5758f9fc76-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:48:10 crc kubenswrapper[4929]: I1002 13:48:10.550185 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tsml" event={"ID":"6b95034d-1dba-42d4-b143-3d5758f9fc76","Type":"ContainerDied","Data":"41affe479606782c01706ed85ee3a8698c6b3395d9d785cc31d4e4001f291e8f"} Oct 02 13:48:10 crc kubenswrapper[4929]: I1002 13:48:10.550336 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tsml" Oct 02 13:48:10 crc kubenswrapper[4929]: I1002 13:48:10.550650 4929 scope.go:117] "RemoveContainer" containerID="0062d336aa448ad8ba06b8ec7fcd12dd6588f38d19c95e5dddea956b0f4e8769" Oct 02 13:48:10 crc kubenswrapper[4929]: I1002 13:48:10.585252 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tsml"] Oct 02 13:48:10 crc kubenswrapper[4929]: I1002 13:48:10.598400 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tsml"] Oct 02 13:48:10 crc kubenswrapper[4929]: I1002 13:48:10.598808 4929 scope.go:117] "RemoveContainer" containerID="b53ed666815f79c571a9e5efd298bec64edd3e08d000f1e96ff8312c73dee248" Oct 02 13:48:10 crc kubenswrapper[4929]: I1002 13:48:10.629204 4929 scope.go:117] "RemoveContainer" containerID="d57f814758bfdbad6a013e01041ed82ba5a93ce338044cb59bc2ad66e27ca757" Oct 02 13:48:12 crc kubenswrapper[4929]: I1002 13:48:12.172915 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" path="/var/lib/kubelet/pods/6b95034d-1dba-42d4-b143-3d5758f9fc76/volumes" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.184113 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jjrtr"] Oct 02 13:48:23 crc kubenswrapper[4929]: E1002 13:48:23.185780 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="extract-content" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.185797 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="extract-content" Oct 02 13:48:23 crc kubenswrapper[4929]: E1002 13:48:23.185812 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="extract-utilities" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.185818 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="extract-utilities" Oct 02 13:48:23 crc kubenswrapper[4929]: E1002 13:48:23.185847 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="registry-server" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.185853 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="registry-server" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.186169 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b95034d-1dba-42d4-b143-3d5758f9fc76" containerName="registry-server" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.187832 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.199653 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjrtr"] Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.267562 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-utilities\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.268130 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-catalog-content\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.268373 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72xfb\" (UniqueName: \"kubernetes.io/projected/1ce899ce-a187-4dcc-a964-964718c4eb07-kube-api-access-72xfb\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.370564 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-catalog-content\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.370927 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72xfb\" (UniqueName: \"kubernetes.io/projected/1ce899ce-a187-4dcc-a964-964718c4eb07-kube-api-access-72xfb\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.371133 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-utilities\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.371186 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-catalog-content\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.371410 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-utilities\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.394082 4929 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-72xfb\" (UniqueName: \"kubernetes.io/projected/1ce899ce-a187-4dcc-a964-964718c4eb07-kube-api-access-72xfb\") pod \"certified-operators-jjrtr\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:23 crc kubenswrapper[4929]: I1002 13:48:23.513105 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:24 crc kubenswrapper[4929]: I1002 13:48:24.071351 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jjrtr"] Oct 02 13:48:24 crc kubenswrapper[4929]: I1002 13:48:24.725274 4929 generic.go:334] "Generic (PLEG): container finished" podID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerID="aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0" exitCode=0 Oct 02 13:48:24 crc kubenswrapper[4929]: I1002 13:48:24.725673 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrtr" event={"ID":"1ce899ce-a187-4dcc-a964-964718c4eb07","Type":"ContainerDied","Data":"aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0"} Oct 02 13:48:24 crc kubenswrapper[4929]: I1002 13:48:24.725737 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrtr" event={"ID":"1ce899ce-a187-4dcc-a964-964718c4eb07","Type":"ContainerStarted","Data":"134fd786fefd88ee2c77944e14826dbac86042a0478ed3ec97a9c5b9243a192c"} Oct 02 13:48:26 crc kubenswrapper[4929]: I1002 13:48:26.753561 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrtr" event={"ID":"1ce899ce-a187-4dcc-a964-964718c4eb07","Type":"ContainerStarted","Data":"9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb"} Oct 02 13:48:27 crc kubenswrapper[4929]: I1002 13:48:27.763256 4929 generic.go:334] "Generic (PLEG): container finished" podID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerID="9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb" exitCode=0 Oct 02 13:48:27 crc kubenswrapper[4929]: I1002 13:48:27.763693 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrtr" event={"ID":"1ce899ce-a187-4dcc-a964-964718c4eb07","Type":"ContainerDied","Data":"9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb"} Oct 02 13:48:28 crc kubenswrapper[4929]: I1002 13:48:28.778899 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrtr" event={"ID":"1ce899ce-a187-4dcc-a964-964718c4eb07","Type":"ContainerStarted","Data":"9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e"} Oct 02 13:48:28 crc kubenswrapper[4929]: I1002 13:48:28.814210 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jjrtr" podStartSLOduration=2.308248973 podStartE2EDuration="5.814190109s" podCreationTimestamp="2025-10-02 13:48:23 +0000 UTC" firstStartedPulling="2025-10-02 13:48:24.728655331 +0000 UTC m=+9505.279021695" lastFinishedPulling="2025-10-02 13:48:28.234596467 +0000 UTC m=+9508.784962831" observedRunningTime="2025-10-02 13:48:28.805611721 +0000 UTC m=+9509.355978105" watchObservedRunningTime="2025-10-02 13:48:28.814190109 +0000 UTC m=+9509.364556473" Oct 02 13:48:33 crc kubenswrapper[4929]: I1002 13:48:33.514387 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:33 crc kubenswrapper[4929]: I1002 13:48:33.515107 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:33 crc kubenswrapper[4929]: I1002 13:48:33.578176 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:33 crc kubenswrapper[4929]: I1002 13:48:33.918416 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:33 crc kubenswrapper[4929]: I1002 13:48:33.974771 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjrtr"] Oct 02 13:48:35 crc kubenswrapper[4929]: I1002 13:48:35.876865 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jjrtr" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="registry-server" containerID="cri-o://9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e" gracePeriod=2 Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.442407 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.513834 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72xfb\" (UniqueName: \"kubernetes.io/projected/1ce899ce-a187-4dcc-a964-964718c4eb07-kube-api-access-72xfb\") pod \"1ce899ce-a187-4dcc-a964-964718c4eb07\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.514412 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-catalog-content\") pod \"1ce899ce-a187-4dcc-a964-964718c4eb07\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.514500 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-utilities\") pod \"1ce899ce-a187-4dcc-a964-964718c4eb07\" (UID: \"1ce899ce-a187-4dcc-a964-964718c4eb07\") " Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.517448 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-utilities" (OuterVolumeSpecName: "utilities") pod "1ce899ce-a187-4dcc-a964-964718c4eb07" (UID: "1ce899ce-a187-4dcc-a964-964718c4eb07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.529260 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce899ce-a187-4dcc-a964-964718c4eb07-kube-api-access-72xfb" (OuterVolumeSpecName: "kube-api-access-72xfb") pod "1ce899ce-a187-4dcc-a964-964718c4eb07" (UID: "1ce899ce-a187-4dcc-a964-964718c4eb07"). InnerVolumeSpecName "kube-api-access-72xfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.618027 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.618059 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72xfb\" (UniqueName: \"kubernetes.io/projected/1ce899ce-a187-4dcc-a964-964718c4eb07-kube-api-access-72xfb\") on node \"crc\" DevicePath \"\"" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.898203 4929 generic.go:334] "Generic (PLEG): container finished" podID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerID="9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e" exitCode=0 Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.898271 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrtr" event={"ID":"1ce899ce-a187-4dcc-a964-964718c4eb07","Type":"ContainerDied","Data":"9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e"} Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.898328 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jjrtr" event={"ID":"1ce899ce-a187-4dcc-a964-964718c4eb07","Type":"ContainerDied","Data":"134fd786fefd88ee2c77944e14826dbac86042a0478ed3ec97a9c5b9243a192c"} Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.898358 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jjrtr" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.898373 4929 scope.go:117] "RemoveContainer" containerID="9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.927603 4929 scope.go:117] "RemoveContainer" containerID="9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb" Oct 02 13:48:36 crc kubenswrapper[4929]: I1002 13:48:36.958901 4929 scope.go:117] "RemoveContainer" containerID="aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.008514 4929 scope.go:117] "RemoveContainer" containerID="9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e" Oct 02 13:48:37 crc kubenswrapper[4929]: E1002 13:48:37.009378 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e\": container with ID starting with 9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e not found: ID does not exist" containerID="9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.009431 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e"} err="failed to get container status \"9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e\": rpc error: code = NotFound desc = could not find container \"9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e\": container with ID starting with 9012ef2be5b0466aafd657f6a249519d925f1cf5dd1fb59d6243d8bf86c6818e not found: ID does not exist" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.009459 4929 scope.go:117] 
"RemoveContainer" containerID="9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb" Oct 02 13:48:37 crc kubenswrapper[4929]: E1002 13:48:37.010112 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb\": container with ID starting with 9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb not found: ID does not exist" containerID="9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.010169 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb"} err="failed to get container status \"9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb\": rpc error: code = NotFound desc = could not find container \"9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb\": container with ID starting with 9a745c49136033640db46cb0c6d32feedf10989723abc679ee35b75f1b6c46cb not found: ID does not exist" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.010209 4929 scope.go:117] "RemoveContainer" containerID="aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0" Oct 02 13:48:37 crc kubenswrapper[4929]: E1002 13:48:37.010679 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0\": container with ID starting with aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0 not found: ID does not exist" containerID="aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.010716 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0"} err="failed to get container status \"aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0\": rpc error: code = NotFound desc = could not find container \"aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0\": container with ID starting with aa5d857a1bdda2459aaad0e3ca4c8376e24c954d4a2a2eb4b2e128ab3aae53e0 not found: ID does not exist" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.265805 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ce899ce-a187-4dcc-a964-964718c4eb07" (UID: "1ce899ce-a187-4dcc-a964-964718c4eb07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.337097 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce899ce-a187-4dcc-a964-964718c4eb07-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.554125 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jjrtr"] Oct 02 13:48:37 crc kubenswrapper[4929]: I1002 13:48:37.566665 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jjrtr"] Oct 02 13:48:38 crc kubenswrapper[4929]: I1002 13:48:38.174102 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" path="/var/lib/kubelet/pods/1ce899ce-a187-4dcc-a964-964718c4eb07/volumes" Oct 02 13:48:44 crc kubenswrapper[4929]: I1002 13:48:44.736356 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:48:44 crc kubenswrapper[4929]: I1002 13:48:44.737006 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:49:14 crc kubenswrapper[4929]: I1002 13:49:14.737290 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:49:14 crc kubenswrapper[4929]: I1002 13:49:14.737995 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:49:44 crc kubenswrapper[4929]: I1002 13:49:44.736675 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:49:44 crc kubenswrapper[4929]: I1002 13:49:44.737545 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:49:44 crc kubenswrapper[4929]: I1002 13:49:44.737600 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:49:44 crc kubenswrapper[4929]: I1002 13:49:44.738579 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:49:44 crc kubenswrapper[4929]: I1002 13:49:44.738638 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" gracePeriod=600 Oct 02 13:49:44 crc kubenswrapper[4929]: E1002 13:49:44.861205 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:49:45 crc kubenswrapper[4929]: I1002 13:49:45.661994 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47"} Oct 02 13:49:45 crc kubenswrapper[4929]: I1002 13:49:45.662381 4929 scope.go:117] "RemoveContainer" containerID="d63c3fd8bc48c58dbeed7c4a6c013d4dbe6685c805ca112a89abba0b4689b1ca" Oct 02 13:49:45 crc kubenswrapper[4929]: I1002 13:49:45.662024 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" exitCode=0 Oct 02 13:49:45 crc kubenswrapper[4929]: I1002 13:49:45.663534 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:49:45 crc kubenswrapper[4929]: E1002 13:49:45.664262 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:49:57 crc kubenswrapper[4929]: I1002 13:49:57.156645 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:49:57 crc kubenswrapper[4929]: E1002 13:49:57.157624 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:50:11 crc kubenswrapper[4929]: I1002 13:50:11.157255 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:50:11 crc kubenswrapper[4929]: E1002 13:50:11.158296 4929 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:50:24 crc kubenswrapper[4929]: I1002 13:50:24.156771 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:50:24 crc kubenswrapper[4929]: E1002 13:50:24.157647 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:50:35 crc kubenswrapper[4929]: I1002 13:50:35.157408 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:50:35 crc kubenswrapper[4929]: E1002 13:50:35.159313 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:50:50 crc kubenswrapper[4929]: I1002 13:50:50.166996 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:50:50 crc kubenswrapper[4929]: E1002 13:50:50.168002 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:51:05 crc kubenswrapper[4929]: I1002 13:51:05.156538 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:51:05 crc kubenswrapper[4929]: E1002 13:51:05.157479 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:51:18 crc kubenswrapper[4929]: I1002 13:51:18.159860 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:51:18 crc kubenswrapper[4929]: E1002 13:51:18.160763 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:51:30 crc kubenswrapper[4929]: I1002 13:51:30.162917 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:51:30 crc kubenswrapper[4929]: E1002 13:51:30.164121 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:51:44 crc kubenswrapper[4929]: I1002 13:51:44.157343 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:51:44 crc kubenswrapper[4929]: E1002 13:51:44.158433 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:51:55 crc kubenswrapper[4929]: I1002 13:51:55.157251 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:51:55 crc kubenswrapper[4929]: E1002 13:51:55.158507 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:52:10 crc kubenswrapper[4929]: I1002 13:52:10.164105 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:52:10 crc kubenswrapper[4929]: E1002 13:52:10.164998 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:52:25 crc kubenswrapper[4929]: I1002 13:52:25.156846 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:52:25 crc kubenswrapper[4929]: E1002 13:52:25.157749 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:52:40 crc kubenswrapper[4929]: I1002 13:52:40.169731 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:52:40 crc kubenswrapper[4929]: E1002 13:52:40.170582 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:52:52 crc kubenswrapper[4929]: I1002 13:52:52.162192 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:52:52 crc kubenswrapper[4929]: E1002 13:52:52.163031 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:53:05 crc kubenswrapper[4929]: I1002 13:53:05.156474 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:53:05 crc kubenswrapper[4929]: E1002 13:53:05.157303 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:53:20 crc kubenswrapper[4929]: I1002 13:53:20.166950 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:53:20 crc kubenswrapper[4929]: E1002 13:53:20.167845 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:53:25 crc kubenswrapper[4929]: I1002 13:53:25.013944 4929 generic.go:334] "Generic (PLEG): container finished" podID="2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" containerID="d7b1db0064c088b3382169bc6ebcc36c80a722a99d1a2da1d3f4e7ab07386b71" exitCode=0 Oct 02 13:53:25 crc kubenswrapper[4929]: I1002 13:53:25.014035 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" event={"ID":"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241","Type":"ContainerDied","Data":"d7b1db0064c088b3382169bc6ebcc36c80a722a99d1a2da1d3f4e7ab07386b71"} Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.538490 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.674675 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-1\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.674735 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-inventory\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.674852 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ssh-key\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.674886 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-1\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.674952 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-0\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.675002 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v9s9\" (UniqueName: \"kubernetes.io/projected/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-kube-api-access-7v9s9\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.675036 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ceph\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.675082 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-1\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.675118 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-0\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.675144 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-combined-ca-bundle\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.675166 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-0\") pod \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\" (UID: \"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241\") " Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.680875 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.680912 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ceph" (OuterVolumeSpecName: "ceph") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.683512 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-kube-api-access-7v9s9" (OuterVolumeSpecName: "kube-api-access-7v9s9") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "kube-api-access-7v9s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.706073 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-inventory" (OuterVolumeSpecName: "inventory") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.706708 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.709897 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.712484 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.712670 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.715093 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.721459 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.731291 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" (UID: "2e41a2c6-d1f1-45c5-92a6-f7cf301ad241"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.777508 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.777947 4929 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778035 4929 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778052 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778060 4929 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778069 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v9s9\" (UniqueName: \"kubernetes.io/projected/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-kube-api-access-7v9s9\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778078 4929 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778085 4929 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778095 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778104 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:26 crc kubenswrapper[4929]: I1002 13:53:26.778113 4929 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2e41a2c6-d1f1-45c5-92a6-f7cf301ad241-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:53:27 crc kubenswrapper[4929]: I1002 13:53:27.046651 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" event={"ID":"2e41a2c6-d1f1-45c5-92a6-f7cf301ad241","Type":"ContainerDied","Data":"0dad4b28e7dde1d306aeb5326f2eccce9e2462f1691eddb85358c37b35ebe9df"} Oct 02 13:53:27 crc kubenswrapper[4929]: I1002 13:53:27.046711 4929 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dad4b28e7dde1d306aeb5326f2eccce9e2462f1691eddb85358c37b35ebe9df" Oct 02 13:53:27 crc kubenswrapper[4929]: I1002 13:53:27.046738 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98" Oct 02 13:53:34 crc kubenswrapper[4929]: I1002 13:53:34.158743 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:53:34 crc kubenswrapper[4929]: E1002 13:53:34.159511 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:53:49 crc kubenswrapper[4929]: I1002 13:53:49.157054 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:53:49 crc kubenswrapper[4929]: E1002 13:53:49.157930 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:54:01 crc kubenswrapper[4929]: I1002 13:54:01.164431 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:54:01 crc kubenswrapper[4929]: E1002 13:54:01.165638 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:54:15 crc kubenswrapper[4929]: I1002 13:54:15.157321 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:54:15 crc kubenswrapper[4929]: E1002 13:54:15.158820 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:54:26 crc kubenswrapper[4929]: I1002 13:54:26.158336 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:54:26 crc kubenswrapper[4929]: E1002 13:54:26.159860 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:54:37 crc kubenswrapper[4929]: I1002 13:54:37.157133 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:54:37 crc kubenswrapper[4929]: E1002 13:54:37.159454 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 13:54:52 crc kubenswrapper[4929]: I1002 13:54:52.157416 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:54:53 crc kubenswrapper[4929]: I1002 13:54:53.018471 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"7f5e49c3f46a77906712aad83f9348a7f979bacb735fe35ef412f45fbde15d2d"} Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.319115 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6766"] Oct 02 13:55:20 crc kubenswrapper[4929]: E1002 13:55:20.320505 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="extract-utilities" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.320522 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="extract-utilities" Oct 02 13:55:20 crc kubenswrapper[4929]: E1002 13:55:20.320552 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="registry-server" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.320574 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="registry-server" Oct 02 13:55:20 crc kubenswrapper[4929]: E1002 13:55:20.320593 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.320602 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 02 13:55:20 crc kubenswrapper[4929]: E1002 13:55:20.320621 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="extract-content" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.320626 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="extract-content" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.320818 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e41a2c6-d1f1-45c5-92a6-f7cf301ad241" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 
13:55:20.320853 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce899ce-a187-4dcc-a964-964718c4eb07" containerName="registry-server" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.344452 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6766"] Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.344639 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.502024 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnfc\" (UniqueName: \"kubernetes.io/projected/c29de723-4461-4edf-83ad-37ef591bd94b-kube-api-access-svnfc\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.502153 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-utilities\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.502293 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-catalog-content\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.604303 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-catalog-content\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.604468 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svnfc\" (UniqueName: \"kubernetes.io/projected/c29de723-4461-4edf-83ad-37ef591bd94b-kube-api-access-svnfc\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.604556 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-utilities\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.604995 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-catalog-content\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.605150 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-utilities\") pod 
\"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.635813 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnfc\" (UniqueName: \"kubernetes.io/projected/c29de723-4461-4edf-83ad-37ef591bd94b-kube-api-access-svnfc\") pod \"redhat-operators-p6766\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:20 crc kubenswrapper[4929]: I1002 13:55:20.670262 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:21 crc kubenswrapper[4929]: I1002 13:55:21.224406 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6766"] Oct 02 13:55:21 crc kubenswrapper[4929]: I1002 13:55:21.328760 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6766" event={"ID":"c29de723-4461-4edf-83ad-37ef591bd94b","Type":"ContainerStarted","Data":"177c0a50a7054f8a1168606d308d3e76d4bf63cab4b32db2ef5e6ac050c60490"} Oct 02 13:55:22 crc kubenswrapper[4929]: I1002 13:55:22.339667 4929 generic.go:334] "Generic (PLEG): container finished" podID="c29de723-4461-4edf-83ad-37ef591bd94b" containerID="b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc" exitCode=0 Oct 02 13:55:22 crc kubenswrapper[4929]: I1002 13:55:22.339728 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6766" event={"ID":"c29de723-4461-4edf-83ad-37ef591bd94b","Type":"ContainerDied","Data":"b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc"} Oct 02 13:55:22 crc kubenswrapper[4929]: I1002 13:55:22.343871 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:55:24 crc kubenswrapper[4929]: I1002 13:55:24.361903 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6766" event={"ID":"c29de723-4461-4edf-83ad-37ef591bd94b","Type":"ContainerStarted","Data":"75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7"} Oct 02 13:55:29 crc kubenswrapper[4929]: I1002 13:55:29.421008 4929 generic.go:334] "Generic (PLEG): container finished" podID="c29de723-4461-4edf-83ad-37ef591bd94b" containerID="75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7" exitCode=0 Oct 02 13:55:29 crc kubenswrapper[4929]: I1002 13:55:29.421111 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6766" event={"ID":"c29de723-4461-4edf-83ad-37ef591bd94b","Type":"ContainerDied","Data":"75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7"} Oct 02 13:55:31 crc kubenswrapper[4929]: I1002 13:55:31.444513 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6766" event={"ID":"c29de723-4461-4edf-83ad-37ef591bd94b","Type":"ContainerStarted","Data":"f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528"} Oct 02 13:55:31 crc kubenswrapper[4929]: I1002 13:55:31.469823 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6766" podStartSLOduration=2.938256442 podStartE2EDuration="11.469801697s" podCreationTimestamp="2025-10-02 13:55:20 +0000 UTC" firstStartedPulling="2025-10-02 13:55:22.343597317 +0000 UTC m=+9922.893963681" 
lastFinishedPulling="2025-10-02 13:55:30.875142572 +0000 UTC m=+9931.425508936" observedRunningTime="2025-10-02 13:55:31.46574321 +0000 UTC m=+9932.016109574" watchObservedRunningTime="2025-10-02 13:55:31.469801697 +0000 UTC m=+9932.020168051" Oct 02 13:55:40 crc kubenswrapper[4929]: I1002 13:55:40.670487 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:40 crc kubenswrapper[4929]: I1002 13:55:40.670887 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:41 crc kubenswrapper[4929]: I1002 13:55:41.672194 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 13:55:41 crc kubenswrapper[4929]: I1002 13:55:41.673113 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="15b5b57e-78a6-41a3-baed-a92c20bb06dd" containerName="adoption" containerID="cri-o://80247cfb8ea20af64baec70686dd3ab20b2ad1b8f59c3081cd79d61d31387831" gracePeriod=30 Oct 02 13:55:41 crc kubenswrapper[4929]: I1002 13:55:41.716270 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p6766" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="registry-server" probeResult="failure" output=< Oct 02 13:55:41 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s Oct 02 13:55:41 crc kubenswrapper[4929]: > Oct 02 13:55:50 crc kubenswrapper[4929]: I1002 13:55:50.717352 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:50 crc kubenswrapper[4929]: I1002 13:55:50.773028 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:51 crc kubenswrapper[4929]: I1002 13:55:51.514145 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6766"] Oct 02 13:55:52 crc kubenswrapper[4929]: I1002 13:55:52.653883 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6766" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="registry-server" containerID="cri-o://f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528" gracePeriod=2 Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.194680 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.338363 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-catalog-content\") pod \"c29de723-4461-4edf-83ad-37ef591bd94b\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.338464 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-utilities\") pod \"c29de723-4461-4edf-83ad-37ef591bd94b\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.338624 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svnfc\" (UniqueName: \"kubernetes.io/projected/c29de723-4461-4edf-83ad-37ef591bd94b-kube-api-access-svnfc\") pod \"c29de723-4461-4edf-83ad-37ef591bd94b\" (UID: \"c29de723-4461-4edf-83ad-37ef591bd94b\") " Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.340204 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-utilities" (OuterVolumeSpecName: "utilities") pod "c29de723-4461-4edf-83ad-37ef591bd94b" (UID: "c29de723-4461-4edf-83ad-37ef591bd94b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.346283 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29de723-4461-4edf-83ad-37ef591bd94b-kube-api-access-svnfc" (OuterVolumeSpecName: "kube-api-access-svnfc") pod "c29de723-4461-4edf-83ad-37ef591bd94b" (UID: "c29de723-4461-4edf-83ad-37ef591bd94b"). InnerVolumeSpecName "kube-api-access-svnfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.432499 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c29de723-4461-4edf-83ad-37ef591bd94b" (UID: "c29de723-4461-4edf-83ad-37ef591bd94b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.441321 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svnfc\" (UniqueName: \"kubernetes.io/projected/c29de723-4461-4edf-83ad-37ef591bd94b-kube-api-access-svnfc\") on node \"crc\" DevicePath \"\"" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.441369 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.441382 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c29de723-4461-4edf-83ad-37ef591bd94b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.667950 4929 generic.go:334] "Generic (PLEG): container finished" podID="c29de723-4461-4edf-83ad-37ef591bd94b" containerID="f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528" exitCode=0 Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.668034 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6766" event={"ID":"c29de723-4461-4edf-83ad-37ef591bd94b","Type":"ContainerDied","Data":"f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528"} Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.668138 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6766" event={"ID":"c29de723-4461-4edf-83ad-37ef591bd94b","Type":"ContainerDied","Data":"177c0a50a7054f8a1168606d308d3e76d4bf63cab4b32db2ef5e6ac050c60490"} Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.668181 4929 scope.go:117] "RemoveContainer" containerID="f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.669157 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6766" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.692089 4929 scope.go:117] "RemoveContainer" containerID="75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7" Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.714867 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6766"] Oct 02 13:55:53 crc kubenswrapper[4929]: I1002 13:55:53.727598 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6766"] Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 13:55:54.171884 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" path="/var/lib/kubelet/pods/c29de723-4461-4edf-83ad-37ef591bd94b/volumes" Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 13:55:54.223846 4929 scope.go:117] "RemoveContainer" containerID="b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc" Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 13:55:54.253169 4929 scope.go:117] "RemoveContainer" containerID="f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528" Oct 02 13:55:54 crc kubenswrapper[4929]: E1002 13:55:54.253775 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528\": container with ID starting with f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528 not found: ID does not exist" containerID="f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528" Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 13:55:54.253806 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528"} err="failed to get container status \"f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528\": rpc error: code = NotFound desc = could not find container \"f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528\": container with ID starting with f195a476a9810e1b38259fae36881c7e14df9795d848619ec85c66315996d528 not found: ID does not exist" Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 13:55:54.253828 4929 scope.go:117] "RemoveContainer" containerID="75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7" Oct 02 13:55:54 crc kubenswrapper[4929]: E1002 13:55:54.254273 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7\": container with ID starting with 75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7 not found: ID does not exist" containerID="75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7" Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 13:55:54.254295 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7"} err="failed to get container status \"75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7\": rpc error: code = NotFound desc = could not find container \"75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7\": container with ID starting with 75d77bf5ff6dc80f929fd68f5e46aa25a96c29d4b230704158b6d7688c4959b7 not found: ID does not exist" Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 
13:55:54.254310 4929 scope.go:117] "RemoveContainer" containerID="b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc" Oct 02 13:55:54 crc kubenswrapper[4929]: E1002 13:55:54.254619 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc\": container with ID starting with b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc not found: ID does not exist" containerID="b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc" Oct 02 13:55:54 crc kubenswrapper[4929]: I1002 13:55:54.254642 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc"} err="failed to get container status \"b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc\": rpc error: code = NotFound desc = could not find container \"b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc\": container with ID starting with b43478798bf12ad94cc8a3aa3fe71b2256cc65c572b16a7039b8e5772f9c7fdc not found: ID does not exist" Oct 02 13:56:11 crc kubenswrapper[4929]: I1002 13:56:11.849605 4929 generic.go:334] "Generic (PLEG): container finished" podID="15b5b57e-78a6-41a3-baed-a92c20bb06dd" containerID="80247cfb8ea20af64baec70686dd3ab20b2ad1b8f59c3081cd79d61d31387831" exitCode=137 Oct 02 13:56:11 crc kubenswrapper[4929]: I1002 13:56:11.849855 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"15b5b57e-78a6-41a3-baed-a92c20bb06dd","Type":"ContainerDied","Data":"80247cfb8ea20af64baec70686dd3ab20b2ad1b8f59c3081cd79d61d31387831"} Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.278428 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.366950 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186\") pod \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.367105 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7k46\" (UniqueName: \"kubernetes.io/projected/15b5b57e-78a6-41a3-baed-a92c20bb06dd-kube-api-access-t7k46\") pod \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\" (UID: \"15b5b57e-78a6-41a3-baed-a92c20bb06dd\") " Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.374083 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b5b57e-78a6-41a3-baed-a92c20bb06dd-kube-api-access-t7k46" (OuterVolumeSpecName: "kube-api-access-t7k46") pod "15b5b57e-78a6-41a3-baed-a92c20bb06dd" (UID: "15b5b57e-78a6-41a3-baed-a92c20bb06dd"). InnerVolumeSpecName "kube-api-access-t7k46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.397378 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186" (OuterVolumeSpecName: "mariadb-data") pod "15b5b57e-78a6-41a3-baed-a92c20bb06dd" (UID: "15b5b57e-78a6-41a3-baed-a92c20bb06dd"). InnerVolumeSpecName "pvc-084da11a-fef6-4192-b401-c3724c38f186". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.473097 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-084da11a-fef6-4192-b401-c3724c38f186\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186\") on node \"crc\" " Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.473144 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7k46\" (UniqueName: \"kubernetes.io/projected/15b5b57e-78a6-41a3-baed-a92c20bb06dd-kube-api-access-t7k46\") on node \"crc\" DevicePath \"\"" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.551043 4929 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.551425 4929 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-084da11a-fef6-4192-b401-c3724c38f186" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186") on node "crc" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.575459 4929 reconciler_common.go:293] "Volume detached for volume \"pvc-084da11a-fef6-4192-b401-c3724c38f186\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-084da11a-fef6-4192-b401-c3724c38f186\") on node \"crc\" DevicePath \"\"" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.862474 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"15b5b57e-78a6-41a3-baed-a92c20bb06dd","Type":"ContainerDied","Data":"caa0dda9a9188d47a6e476d42c9e66addc22dc8020a7acf08721711163c1ff55"} Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.862520 4929 scope.go:117] "RemoveContainer" containerID="80247cfb8ea20af64baec70686dd3ab20b2ad1b8f59c3081cd79d61d31387831" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.863625 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.899091 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 13:56:12 crc kubenswrapper[4929]: I1002 13:56:12.909138 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 13:56:13 crc kubenswrapper[4929]: I1002 13:56:13.591695 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 02 13:56:13 crc kubenswrapper[4929]: I1002 13:56:13.591934 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="598bbc9b-7eee-41ca-9078-7edd3464e2f9" containerName="adoption" containerID="cri-o://46f5c01f67a15f7f058e412b7479965a4dcd2ef2221ebbf3be3e34db9c64577d" gracePeriod=30 Oct 02 13:56:14 crc kubenswrapper[4929]: I1002 13:56:14.180595 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b5b57e-78a6-41a3-baed-a92c20bb06dd" path="/var/lib/kubelet/pods/15b5b57e-78a6-41a3-baed-a92c20bb06dd/volumes" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.193420 4929 generic.go:334] "Generic (PLEG): container finished" podID="598bbc9b-7eee-41ca-9078-7edd3464e2f9" containerID="46f5c01f67a15f7f058e412b7479965a4dcd2ef2221ebbf3be3e34db9c64577d" exitCode=137 Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.193598 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"598bbc9b-7eee-41ca-9078-7edd3464e2f9","Type":"ContainerDied","Data":"46f5c01f67a15f7f058e412b7479965a4dcd2ef2221ebbf3be3e34db9c64577d"} Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.536739 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.715408 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9\") pod \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.715548 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j56ws\" (UniqueName: \"kubernetes.io/projected/598bbc9b-7eee-41ca-9078-7edd3464e2f9-kube-api-access-j56ws\") pod \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.715669 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/598bbc9b-7eee-41ca-9078-7edd3464e2f9-ovn-data-cert\") pod \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\" (UID: \"598bbc9b-7eee-41ca-9078-7edd3464e2f9\") " Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.724300 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598bbc9b-7eee-41ca-9078-7edd3464e2f9-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "598bbc9b-7eee-41ca-9078-7edd3464e2f9" (UID: "598bbc9b-7eee-41ca-9078-7edd3464e2f9"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.724390 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598bbc9b-7eee-41ca-9078-7edd3464e2f9-kube-api-access-j56ws" (OuterVolumeSpecName: "kube-api-access-j56ws") pod "598bbc9b-7eee-41ca-9078-7edd3464e2f9" (UID: "598bbc9b-7eee-41ca-9078-7edd3464e2f9"). InnerVolumeSpecName "kube-api-access-j56ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.743156 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9" (OuterVolumeSpecName: "ovn-data") pod "598bbc9b-7eee-41ca-9078-7edd3464e2f9" (UID: "598bbc9b-7eee-41ca-9078-7edd3464e2f9"). InnerVolumeSpecName "pvc-223d51a7-9b46-46d0-91db-be146b2741a9". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.818618 4929 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-223d51a7-9b46-46d0-91db-be146b2741a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9\") on node \"crc\" " Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.818675 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j56ws\" (UniqueName: \"kubernetes.io/projected/598bbc9b-7eee-41ca-9078-7edd3464e2f9-kube-api-access-j56ws\") on node \"crc\" DevicePath \"\"" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.818692 4929 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/598bbc9b-7eee-41ca-9078-7edd3464e2f9-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.868854 4929 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.869041 4929 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-223d51a7-9b46-46d0-91db-be146b2741a9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9") on node "crc" Oct 02 13:56:44 crc kubenswrapper[4929]: I1002 13:56:44.920320 4929 reconciler_common.go:293] "Volume detached for volume \"pvc-223d51a7-9b46-46d0-91db-be146b2741a9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-223d51a7-9b46-46d0-91db-be146b2741a9\") on node \"crc\" DevicePath \"\"" Oct 02 13:56:45 crc kubenswrapper[4929]: I1002 13:56:45.205851 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"598bbc9b-7eee-41ca-9078-7edd3464e2f9","Type":"ContainerDied","Data":"045bc22596b7e9773f74b1632387bdb6d0f6dc0e8a8baf0f82209f7dd291aeea"} Oct 02 13:56:45 crc kubenswrapper[4929]: I1002 13:56:45.205904 4929 scope.go:117] "RemoveContainer" containerID="46f5c01f67a15f7f058e412b7479965a4dcd2ef2221ebbf3be3e34db9c64577d" Oct 02 13:56:45 crc kubenswrapper[4929]: I1002 13:56:45.205938 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 02 13:56:45 crc kubenswrapper[4929]: I1002 13:56:45.259141 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 02 13:56:45 crc kubenswrapper[4929]: I1002 13:56:45.286572 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Oct 02 13:56:46 crc kubenswrapper[4929]: I1002 13:56:46.180687 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598bbc9b-7eee-41ca-9078-7edd3464e2f9" path="/var/lib/kubelet/pods/598bbc9b-7eee-41ca-9078-7edd3464e2f9/volumes" Oct 02 13:57:14 crc kubenswrapper[4929]: I1002 13:57:14.737207 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:57:14 crc kubenswrapper[4929]: I1002 13:57:14.737670 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:57:44 crc kubenswrapper[4929]: I1002 13:57:44.737017 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:57:44 crc kubenswrapper[4929]: I1002 13:57:44.737725 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.018724 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6j7s/must-gather-wwcqc"] Oct 02 13:57:56 crc kubenswrapper[4929]: E1002 13:57:56.019758 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="extract-utilities" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.019771 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="extract-utilities" Oct 02 13:57:56 crc kubenswrapper[4929]: E1002 13:57:56.019783 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="registry-server" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.019791 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="registry-server" Oct 02 13:57:56 crc kubenswrapper[4929]: E1002 13:57:56.019813 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="extract-content" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.019820 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="extract-content" Oct 02 13:57:56 crc kubenswrapper[4929]: E1002 13:57:56.019838 4929 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="15b5b57e-78a6-41a3-baed-a92c20bb06dd" containerName="adoption" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.019844 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b5b57e-78a6-41a3-baed-a92c20bb06dd" containerName="adoption" Oct 02 13:57:56 crc kubenswrapper[4929]: E1002 13:57:56.019870 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598bbc9b-7eee-41ca-9078-7edd3464e2f9" containerName="adoption" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.019876 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="598bbc9b-7eee-41ca-9078-7edd3464e2f9" containerName="adoption" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.020072 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29de723-4461-4edf-83ad-37ef591bd94b" containerName="registry-server" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.020096 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b5b57e-78a6-41a3-baed-a92c20bb06dd" containerName="adoption" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.020106 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="598bbc9b-7eee-41ca-9078-7edd3464e2f9" containerName="adoption" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.021227 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.034974 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t6j7s"/"openshift-service-ca.crt" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.038615 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t6j7s"/"kube-root-ca.crt" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.049812 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6j7s/must-gather-wwcqc"] Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.059445 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff029149-4944-4650-9351-69aa7531e9a6-must-gather-output\") pod \"must-gather-wwcqc\" (UID: \"ff029149-4944-4650-9351-69aa7531e9a6\") " pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.059797 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbp5\" (UniqueName: \"kubernetes.io/projected/ff029149-4944-4650-9351-69aa7531e9a6-kube-api-access-xcbp5\") pod \"must-gather-wwcqc\" (UID: \"ff029149-4944-4650-9351-69aa7531e9a6\") " pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.161909 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff029149-4944-4650-9351-69aa7531e9a6-must-gather-output\") pod \"must-gather-wwcqc\" (UID: \"ff029149-4944-4650-9351-69aa7531e9a6\") " pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.162090 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbp5\" (UniqueName: \"kubernetes.io/projected/ff029149-4944-4650-9351-69aa7531e9a6-kube-api-access-xcbp5\") pod \"must-gather-wwcqc\" (UID: 
\"ff029149-4944-4650-9351-69aa7531e9a6\") " pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.162621 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff029149-4944-4650-9351-69aa7531e9a6-must-gather-output\") pod \"must-gather-wwcqc\" (UID: \"ff029149-4944-4650-9351-69aa7531e9a6\") " pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.217195 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbp5\" (UniqueName: \"kubernetes.io/projected/ff029149-4944-4650-9351-69aa7531e9a6-kube-api-access-xcbp5\") pod \"must-gather-wwcqc\" (UID: \"ff029149-4944-4650-9351-69aa7531e9a6\") " pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.310893 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dh6th"] Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.313909 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.336075 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dh6th"] Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.359656 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.469924 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-catalog-content\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.469996 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btxqb\" (UniqueName: \"kubernetes.io/projected/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-kube-api-access-btxqb\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.470151 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-utilities\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.572091 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-catalog-content\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.572146 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btxqb\" (UniqueName: \"kubernetes.io/projected/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-kube-api-access-btxqb\") pod \"community-operators-dh6th\" (UID: 
\"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.572176 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-utilities\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.572639 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-catalog-content\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:56 crc kubenswrapper[4929]: I1002 13:57:56.572665 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-utilities\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:57 crc kubenswrapper[4929]: I1002 13:57:57.095161 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btxqb\" (UniqueName: \"kubernetes.io/projected/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-kube-api-access-btxqb\") pod \"community-operators-dh6th\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:57 crc kubenswrapper[4929]: I1002 13:57:57.238731 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:57:57 crc kubenswrapper[4929]: I1002 13:57:57.890990 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6j7s/must-gather-wwcqc"] Oct 02 13:57:57 crc kubenswrapper[4929]: I1002 13:57:57.959534 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" event={"ID":"ff029149-4944-4650-9351-69aa7531e9a6","Type":"ContainerStarted","Data":"6553edfc9f2e984ac849255e349fa5fdc0d031171cec74aaf1020da009d1539d"} Oct 02 13:57:58 crc kubenswrapper[4929]: W1002 13:57:58.155214 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b59ffc9_2256_4b82_a4cb_7eff70ad1ac3.slice/crio-7279892e116c7d6ebc8f4e3ea7f5c2362609c10caccd8ab91536d9d5f33ce422 WatchSource:0}: Error finding container 7279892e116c7d6ebc8f4e3ea7f5c2362609c10caccd8ab91536d9d5f33ce422: Status 404 returned error can't find the container with id 7279892e116c7d6ebc8f4e3ea7f5c2362609c10caccd8ab91536d9d5f33ce422 Oct 02 13:57:58 crc kubenswrapper[4929]: I1002 13:57:58.178387 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dh6th"] Oct 02 13:57:58 crc kubenswrapper[4929]: I1002 13:57:58.973976 4929 generic.go:334] "Generic (PLEG): container finished" podID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerID="e1f584b9441ac0dc4187c1a7d3a36b247f03db412664ab5ac1d8f681c302a47c" exitCode=0 Oct 02 13:57:58 crc kubenswrapper[4929]: I1002 13:57:58.974251 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh6th" 
event={"ID":"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3","Type":"ContainerDied","Data":"e1f584b9441ac0dc4187c1a7d3a36b247f03db412664ab5ac1d8f681c302a47c"} Oct 02 13:57:58 crc kubenswrapper[4929]: I1002 13:57:58.974357 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh6th" event={"ID":"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3","Type":"ContainerStarted","Data":"7279892e116c7d6ebc8f4e3ea7f5c2362609c10caccd8ab91536d9d5f33ce422"} Oct 02 13:58:01 crc kubenswrapper[4929]: I1002 13:58:01.002692 4929 generic.go:334] "Generic (PLEG): container finished" podID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerID="4dcdf5fd655fbe260a2ebeefb95488687a9368cfdbb122d4bbc989a52f6ae2cd" exitCode=0 Oct 02 13:58:01 crc kubenswrapper[4929]: I1002 13:58:01.003219 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh6th" event={"ID":"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3","Type":"ContainerDied","Data":"4dcdf5fd655fbe260a2ebeefb95488687a9368cfdbb122d4bbc989a52f6ae2cd"} Oct 02 13:58:09 crc kubenswrapper[4929]: I1002 13:58:09.119156 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh6th" event={"ID":"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3","Type":"ContainerStarted","Data":"b4ceff1fdb1d74cd008671b7dd784aea63073340f8a8c93f881db1e8eae83887"} Oct 02 13:58:09 crc kubenswrapper[4929]: I1002 13:58:09.127256 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" event={"ID":"ff029149-4944-4650-9351-69aa7531e9a6","Type":"ContainerStarted","Data":"88aa457f10c0f60757fe1ae770b191cb525348b7d9c6cb54008e5fd47787175c"} Oct 02 13:58:09 crc kubenswrapper[4929]: I1002 13:58:09.127301 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" event={"ID":"ff029149-4944-4650-9351-69aa7531e9a6","Type":"ContainerStarted","Data":"5b2a31ae770dc6273b7f8a17f720ab66d7233f59d21285c19adce21563bc8931"} Oct 02 13:58:09 crc kubenswrapper[4929]: I1002 13:58:09.143017 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dh6th" podStartSLOduration=4.056626637 podStartE2EDuration="13.142996349s" podCreationTimestamp="2025-10-02 13:57:56 +0000 UTC" firstStartedPulling="2025-10-02 13:57:58.980015198 +0000 UTC m=+10079.530381572" lastFinishedPulling="2025-10-02 13:58:08.06638492 +0000 UTC m=+10088.616751284" observedRunningTime="2025-10-02 13:58:09.134838634 +0000 UTC m=+10089.685204998" watchObservedRunningTime="2025-10-02 13:58:09.142996349 +0000 UTC m=+10089.693362713" Oct 02 13:58:09 crc kubenswrapper[4929]: I1002 13:58:09.160562 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" podStartSLOduration=3.908913036 podStartE2EDuration="14.160541915s" podCreationTimestamp="2025-10-02 13:57:55 +0000 UTC" firstStartedPulling="2025-10-02 13:57:57.899020612 +0000 UTC m=+10078.449386976" lastFinishedPulling="2025-10-02 13:58:08.150649481 +0000 UTC m=+10088.701015855" observedRunningTime="2025-10-02 13:58:09.150733552 +0000 UTC m=+10089.701099916" watchObservedRunningTime="2025-10-02 13:58:09.160541915 +0000 UTC m=+10089.710908279" Oct 02 13:58:12 crc kubenswrapper[4929]: I1002 13:58:12.936668 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-4hj44"] Oct 02 13:58:12 crc kubenswrapper[4929]: I1002 13:58:12.938579 4929 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:12 crc kubenswrapper[4929]: I1002 13:58:12.940345 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t6j7s"/"default-dockercfg-s4pvb" Oct 02 13:58:13 crc kubenswrapper[4929]: I1002 13:58:13.041028 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdjvs\" (UniqueName: \"kubernetes.io/projected/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-kube-api-access-gdjvs\") pod \"crc-debug-4hj44\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") " pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:13 crc kubenswrapper[4929]: I1002 13:58:13.041246 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-host\") pod \"crc-debug-4hj44\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") " pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:13 crc kubenswrapper[4929]: I1002 13:58:13.143093 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-host\") pod \"crc-debug-4hj44\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") " pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:13 crc kubenswrapper[4929]: I1002 13:58:13.143249 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-host\") pod \"crc-debug-4hj44\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") " pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:13 crc kubenswrapper[4929]: I1002 13:58:13.143286 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdjvs\" (UniqueName: \"kubernetes.io/projected/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-kube-api-access-gdjvs\") pod \"crc-debug-4hj44\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") " pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:13 crc kubenswrapper[4929]: I1002 13:58:13.167872 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdjvs\" (UniqueName: \"kubernetes.io/projected/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-kube-api-access-gdjvs\") pod \"crc-debug-4hj44\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") " pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:13 crc kubenswrapper[4929]: I1002 13:58:13.265445 4929 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-4hj44" Oct 02 13:58:13 crc kubenswrapper[4929]: W1002 13:58:13.301905 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a5e8da_6cfb_4da4_9e43_ef5aac9e9818.slice/crio-f20e8f687bd4a4b54806d32b28641f78a36763d48746ff6e04ff40034dea49df WatchSource:0}: Error finding container f20e8f687bd4a4b54806d32b28641f78a36763d48746ff6e04ff40034dea49df: Status 404 returned error can't find the container with id f20e8f687bd4a4b54806d32b28641f78a36763d48746ff6e04ff40034dea49df Oct 02 13:58:14 crc kubenswrapper[4929]: I1002 13:58:14.175656 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-4hj44" event={"ID":"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818","Type":"ContainerStarted","Data":"f20e8f687bd4a4b54806d32b28641f78a36763d48746ff6e04ff40034dea49df"} Oct 02 13:58:14 crc kubenswrapper[4929]: I1002 13:58:14.737104 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:58:14 crc kubenswrapper[4929]: I1002 13:58:14.737199 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:58:14 crc kubenswrapper[4929]: I1002 13:58:14.737280 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488" Oct 02 13:58:14 crc kubenswrapper[4929]: I1002 13:58:14.738528 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f5e49c3f46a77906712aad83f9348a7f979bacb735fe35ef412f45fbde15d2d"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:58:14 crc kubenswrapper[4929]: I1002 13:58:14.738592 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://7f5e49c3f46a77906712aad83f9348a7f979bacb735fe35ef412f45fbde15d2d" gracePeriod=600 Oct 02 13:58:15 crc kubenswrapper[4929]: I1002 13:58:15.192036 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="7f5e49c3f46a77906712aad83f9348a7f979bacb735fe35ef412f45fbde15d2d" exitCode=0 Oct 02 13:58:15 crc kubenswrapper[4929]: I1002 13:58:15.192092 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"7f5e49c3f46a77906712aad83f9348a7f979bacb735fe35ef412f45fbde15d2d"} Oct 02 13:58:15 crc kubenswrapper[4929]: I1002 13:58:15.192366 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" 
event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"} Oct 02 13:58:15 crc kubenswrapper[4929]: I1002 13:58:15.192389 4929 scope.go:117] "RemoveContainer" containerID="6f6e907f5f3b65d8bbd67b0070c60f4833ebd82fe005a18fb512cf73949f2c47" Oct 02 13:58:17 crc kubenswrapper[4929]: I1002 13:58:17.241102 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:58:17 crc kubenswrapper[4929]: I1002 13:58:17.241446 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:58:17 crc kubenswrapper[4929]: I1002 13:58:17.317902 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:58:18 crc kubenswrapper[4929]: I1002 13:58:18.305187 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:58:18 crc kubenswrapper[4929]: I1002 13:58:18.364895 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dh6th"] Oct 02 13:58:20 crc kubenswrapper[4929]: I1002 13:58:20.257569 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dh6th" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="registry-server" containerID="cri-o://b4ceff1fdb1d74cd008671b7dd784aea63073340f8a8c93f881db1e8eae83887" gracePeriod=2 Oct 02 13:58:21 crc kubenswrapper[4929]: I1002 13:58:21.277911 4929 generic.go:334] "Generic (PLEG): container finished" podID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerID="b4ceff1fdb1d74cd008671b7dd784aea63073340f8a8c93f881db1e8eae83887" exitCode=0 Oct 02 13:58:21 crc kubenswrapper[4929]: I1002 13:58:21.277983 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh6th" event={"ID":"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3","Type":"ContainerDied","Data":"b4ceff1fdb1d74cd008671b7dd784aea63073340f8a8c93f881db1e8eae83887"} Oct 02 13:58:26 crc kubenswrapper[4929]: I1002 13:58:26.907593 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.078952 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-catalog-content\") pod \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.079857 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btxqb\" (UniqueName: \"kubernetes.io/projected/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-kube-api-access-btxqb\") pod \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.080278 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-utilities\") pod \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\" (UID: \"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3\") " Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.081158 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-utilities" (OuterVolumeSpecName: "utilities") pod "0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" (UID: "0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.082563 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.088230 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-kube-api-access-btxqb" (OuterVolumeSpecName: "kube-api-access-btxqb") pod "0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" (UID: "0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3"). InnerVolumeSpecName "kube-api-access-btxqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.135900 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" (UID: "0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.184467 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.185195 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btxqb\" (UniqueName: \"kubernetes.io/projected/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3-kube-api-access-btxqb\") on node \"crc\" DevicePath \"\"" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.358213 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dh6th" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.358213 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dh6th" event={"ID":"0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3","Type":"ContainerDied","Data":"7279892e116c7d6ebc8f4e3ea7f5c2362609c10caccd8ab91536d9d5f33ce422"} Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.358638 4929 scope.go:117] "RemoveContainer" containerID="b4ceff1fdb1d74cd008671b7dd784aea63073340f8a8c93f881db1e8eae83887" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.360917 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-4hj44" event={"ID":"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818","Type":"ContainerStarted","Data":"07e052b2ad4383bb7d87df0d651785c233201813e0af2568a5a62a1d468fcd76"} Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.434110 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6j7s/crc-debug-4hj44" podStartSLOduration=2.340550737 podStartE2EDuration="15.434085596s" podCreationTimestamp="2025-10-02 13:58:12 +0000 UTC" firstStartedPulling="2025-10-02 13:58:13.306048973 +0000 UTC m=+10093.856415337" lastFinishedPulling="2025-10-02 13:58:26.399583832 +0000 UTC m=+10106.949950196" observedRunningTime="2025-10-02 13:58:27.381485719 +0000 UTC m=+10107.931852093" watchObservedRunningTime="2025-10-02 13:58:27.434085596 +0000 UTC m=+10107.984451960" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.452609 4929 scope.go:117] "RemoveContainer" containerID="4dcdf5fd655fbe260a2ebeefb95488687a9368cfdbb122d4bbc989a52f6ae2cd" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.471333 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dh6th"] Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.496449 4929 scope.go:117] "RemoveContainer" containerID="e1f584b9441ac0dc4187c1a7d3a36b247f03db412664ab5ac1d8f681c302a47c" Oct 02 13:58:27 crc kubenswrapper[4929]: I1002 13:58:27.498282 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dh6th"] Oct 02 13:58:27 crc kubenswrapper[4929]: E1002 13:58:27.596243 4929 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b59ffc9_2256_4b82_a4cb_7eff70ad1ac3.slice\": RecentStats: unable to find data in memory cache]" Oct 02 13:58:28 crc kubenswrapper[4929]: I1002 13:58:28.173562 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" path="/var/lib/kubelet/pods/0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3/volumes" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.174811 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8jds"] Oct 02 13:58:56 crc kubenswrapper[4929]: E1002 13:58:56.176140 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="extract-utilities" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.176159 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="extract-utilities" Oct 02 13:58:56 crc kubenswrapper[4929]: E1002 13:58:56.176190 4929 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="extract-content" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.176199 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="extract-content" Oct 02 13:58:56 crc kubenswrapper[4929]: E1002 13:58:56.176236 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="registry-server" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.176244 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="registry-server" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.177362 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b59ffc9-2256-4b82-a4cb-7eff70ad1ac3" containerName="registry-server" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.183856 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.192605 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8jds"] Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.249622 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7ggl\" (UniqueName: \"kubernetes.io/projected/efb5d6a7-39b6-4bda-8da1-623951efdfba-kube-api-access-k7ggl\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.249724 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-catalog-content\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.249754 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-utilities\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.350710 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7ggl\" (UniqueName: \"kubernetes.io/projected/efb5d6a7-39b6-4bda-8da1-623951efdfba-kube-api-access-k7ggl\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.350795 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-catalog-content\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.350821 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-utilities\") pod 
\"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.351344 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-catalog-content\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.351446 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-utilities\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.385068 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7ggl\" (UniqueName: \"kubernetes.io/projected/efb5d6a7-39b6-4bda-8da1-623951efdfba-kube-api-access-k7ggl\") pod \"certified-operators-r8jds\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:56 crc kubenswrapper[4929]: I1002 13:58:56.511378 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:58:57 crc kubenswrapper[4929]: I1002 13:58:57.149997 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8jds"] Oct 02 13:58:57 crc kubenswrapper[4929]: I1002 13:58:57.770126 4929 generic.go:334] "Generic (PLEG): container finished" podID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerID="a84e90dc6255a891421cad7bec644a4842d79726b867722f3ac3d0afd29f8a11" exitCode=0 Oct 02 13:58:57 crc kubenswrapper[4929]: I1002 13:58:57.770418 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jds" event={"ID":"efb5d6a7-39b6-4bda-8da1-623951efdfba","Type":"ContainerDied","Data":"a84e90dc6255a891421cad7bec644a4842d79726b867722f3ac3d0afd29f8a11"} Oct 02 13:58:57 crc kubenswrapper[4929]: I1002 13:58:57.770451 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jds" event={"ID":"efb5d6a7-39b6-4bda-8da1-623951efdfba","Type":"ContainerStarted","Data":"fcfb071f2e9978b29018f2ce7702bda4ae90c90092e962d67fe4d760c0e37e13"} Oct 02 13:58:59 crc kubenswrapper[4929]: I1002 13:58:59.793133 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jds" event={"ID":"efb5d6a7-39b6-4bda-8da1-623951efdfba","Type":"ContainerStarted","Data":"2ee0887fe2aa71f08af2765ee66bb5f018fee299eca7b30ffb3e7a5d5e3a98ed"} Oct 02 13:59:04 crc kubenswrapper[4929]: I1002 13:59:04.843785 4929 generic.go:334] "Generic (PLEG): container finished" podID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerID="2ee0887fe2aa71f08af2765ee66bb5f018fee299eca7b30ffb3e7a5d5e3a98ed" exitCode=0 Oct 02 13:59:04 crc kubenswrapper[4929]: I1002 13:59:04.843868 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jds" event={"ID":"efb5d6a7-39b6-4bda-8da1-623951efdfba","Type":"ContainerDied","Data":"2ee0887fe2aa71f08af2765ee66bb5f018fee299eca7b30ffb3e7a5d5e3a98ed"} Oct 02 13:59:07 crc kubenswrapper[4929]: I1002 13:59:07.899227 4929 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jds" event={"ID":"efb5d6a7-39b6-4bda-8da1-623951efdfba","Type":"ContainerStarted","Data":"21fb16a70a0a9f23d9a30b500cf786976c3cf639f034fce3a1d2fbcbfec4b507"} Oct 02 13:59:07 crc kubenswrapper[4929]: I1002 13:59:07.924219 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8jds" podStartSLOduration=4.1753507 podStartE2EDuration="11.924191144s" podCreationTimestamp="2025-10-02 13:58:56 +0000 UTC" firstStartedPulling="2025-10-02 13:58:57.772823507 +0000 UTC m=+10138.323189871" lastFinishedPulling="2025-10-02 13:59:05.521663951 +0000 UTC m=+10146.072030315" observedRunningTime="2025-10-02 13:59:07.917795359 +0000 UTC m=+10148.468161743" watchObservedRunningTime="2025-10-02 13:59:07.924191144 +0000 UTC m=+10148.474557508" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.609263 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jzlqd"] Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.615563 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.625290 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzlqd"] Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.690071 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-utilities\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.690200 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9862\" (UniqueName: \"kubernetes.io/projected/628328e4-e417-420c-a3c7-1d898f03f6ec-kube-api-access-m9862\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.690282 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-catalog-content\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.792282 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-utilities\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.792384 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9862\" (UniqueName: \"kubernetes.io/projected/628328e4-e417-420c-a3c7-1d898f03f6ec-kube-api-access-m9862\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.792455 4929 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-catalog-content\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.792898 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-utilities\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.793115 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-catalog-content\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.817296 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9862\" (UniqueName: \"kubernetes.io/projected/628328e4-e417-420c-a3c7-1d898f03f6ec-kube-api-access-m9862\") pod \"redhat-marketplace-jzlqd\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") " pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:13 crc kubenswrapper[4929]: I1002 13:59:13.943360 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:14 crc kubenswrapper[4929]: I1002 13:59:14.720497 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzlqd"] Oct 02 13:59:14 crc kubenswrapper[4929]: W1002 13:59:14.742566 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628328e4_e417_420c_a3c7_1d898f03f6ec.slice/crio-9011352686f2387488239156ee07ab85734550e43b9a34003094b4572042e4d6 WatchSource:0}: Error finding container 9011352686f2387488239156ee07ab85734550e43b9a34003094b4572042e4d6: Status 404 returned error can't find the container with id 9011352686f2387488239156ee07ab85734550e43b9a34003094b4572042e4d6 Oct 02 13:59:14 crc kubenswrapper[4929]: I1002 13:59:14.972568 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzlqd" event={"ID":"628328e4-e417-420c-a3c7-1d898f03f6ec","Type":"ContainerStarted","Data":"9011352686f2387488239156ee07ab85734550e43b9a34003094b4572042e4d6"} Oct 02 13:59:15 crc kubenswrapper[4929]: I1002 13:59:15.987740 4929 generic.go:334] "Generic (PLEG): container finished" podID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerID="60a7db5a19f17470604f36e5ec9cc03313d0c3db89e3e1d4bfebfb4397bfbcb1" exitCode=0 Oct 02 13:59:15 crc kubenswrapper[4929]: I1002 13:59:15.987870 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzlqd" event={"ID":"628328e4-e417-420c-a3c7-1d898f03f6ec","Type":"ContainerDied","Data":"60a7db5a19f17470604f36e5ec9cc03313d0c3db89e3e1d4bfebfb4397bfbcb1"} Oct 02 13:59:16 crc kubenswrapper[4929]: I1002 13:59:16.515167 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:59:16 crc kubenswrapper[4929]: I1002 13:59:16.515762 4929 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:59:16 crc kubenswrapper[4929]: I1002 13:59:16.566075 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:59:17 crc kubenswrapper[4929]: I1002 13:59:17.060662 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:59:19 crc kubenswrapper[4929]: I1002 13:59:19.407729 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8jds"] Oct 02 13:59:19 crc kubenswrapper[4929]: I1002 13:59:19.408527 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8jds" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="registry-server" containerID="cri-o://21fb16a70a0a9f23d9a30b500cf786976c3cf639f034fce3a1d2fbcbfec4b507" gracePeriod=2 Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.038002 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzlqd" event={"ID":"628328e4-e417-420c-a3c7-1d898f03f6ec","Type":"ContainerStarted","Data":"e76badf0107b41509870e1ad2a778f3288a407580dd5efebbf5584856346f7b7"} Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.043182 4929 generic.go:334] "Generic (PLEG): container finished" podID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerID="21fb16a70a0a9f23d9a30b500cf786976c3cf639f034fce3a1d2fbcbfec4b507" exitCode=0 Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.043261 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jds" event={"ID":"efb5d6a7-39b6-4bda-8da1-623951efdfba","Type":"ContainerDied","Data":"21fb16a70a0a9f23d9a30b500cf786976c3cf639f034fce3a1d2fbcbfec4b507"} Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.587241 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.774396 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-utilities\") pod \"efb5d6a7-39b6-4bda-8da1-623951efdfba\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.774678 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7ggl\" (UniqueName: \"kubernetes.io/projected/efb5d6a7-39b6-4bda-8da1-623951efdfba-kube-api-access-k7ggl\") pod \"efb5d6a7-39b6-4bda-8da1-623951efdfba\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.774909 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-catalog-content\") pod \"efb5d6a7-39b6-4bda-8da1-623951efdfba\" (UID: \"efb5d6a7-39b6-4bda-8da1-623951efdfba\") " Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.775337 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-utilities" (OuterVolumeSpecName: "utilities") pod "efb5d6a7-39b6-4bda-8da1-623951efdfba" (UID: "efb5d6a7-39b6-4bda-8da1-623951efdfba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.776098 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.787270 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb5d6a7-39b6-4bda-8da1-623951efdfba-kube-api-access-k7ggl" (OuterVolumeSpecName: "kube-api-access-k7ggl") pod "efb5d6a7-39b6-4bda-8da1-623951efdfba" (UID: "efb5d6a7-39b6-4bda-8da1-623951efdfba"). InnerVolumeSpecName "kube-api-access-k7ggl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.852564 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb5d6a7-39b6-4bda-8da1-623951efdfba" (UID: "efb5d6a7-39b6-4bda-8da1-623951efdfba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.878419 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7ggl\" (UniqueName: \"kubernetes.io/projected/efb5d6a7-39b6-4bda-8da1-623951efdfba-kube-api-access-k7ggl\") on node \"crc\" DevicePath \"\"" Oct 02 13:59:20 crc kubenswrapper[4929]: I1002 13:59:20.878734 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb5d6a7-39b6-4bda-8da1-623951efdfba-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.059686 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jds" event={"ID":"efb5d6a7-39b6-4bda-8da1-623951efdfba","Type":"ContainerDied","Data":"fcfb071f2e9978b29018f2ce7702bda4ae90c90092e962d67fe4d760c0e37e13"} Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.060078 4929 scope.go:117] "RemoveContainer" containerID="21fb16a70a0a9f23d9a30b500cf786976c3cf639f034fce3a1d2fbcbfec4b507" Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.059688 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8jds" Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.067319 4929 generic.go:334] "Generic (PLEG): container finished" podID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerID="e76badf0107b41509870e1ad2a778f3288a407580dd5efebbf5584856346f7b7" exitCode=0 Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.067360 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzlqd" event={"ID":"628328e4-e417-420c-a3c7-1d898f03f6ec","Type":"ContainerDied","Data":"e76badf0107b41509870e1ad2a778f3288a407580dd5efebbf5584856346f7b7"} Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.124090 4929 scope.go:117] "RemoveContainer" containerID="2ee0887fe2aa71f08af2765ee66bb5f018fee299eca7b30ffb3e7a5d5e3a98ed" Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.146366 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8jds"] Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.229780 4929 scope.go:117] "RemoveContainer" containerID="a84e90dc6255a891421cad7bec644a4842d79726b867722f3ac3d0afd29f8a11" Oct 02 13:59:21 crc kubenswrapper[4929]: I1002 13:59:21.238288 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8jds"] Oct 02 13:59:22 crc kubenswrapper[4929]: I1002 13:59:22.113467 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzlqd" event={"ID":"628328e4-e417-420c-a3c7-1d898f03f6ec","Type":"ContainerStarted","Data":"44b14602041360edae48fd7ed4aae1695cd6ddfaafa314c65a05cf5dbda97bd1"} Oct 02 13:59:22 crc kubenswrapper[4929]: I1002 13:59:22.135314 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jzlqd" podStartSLOduration=3.375168844 podStartE2EDuration="9.135292263s" podCreationTimestamp="2025-10-02 13:59:13 +0000 UTC" firstStartedPulling="2025-10-02 13:59:15.990706892 +0000 UTC m=+10156.541073256" lastFinishedPulling="2025-10-02 13:59:21.750830311 +0000 UTC m=+10162.301196675" observedRunningTime="2025-10-02 13:59:22.134075388 +0000 UTC m=+10162.684441752" watchObservedRunningTime="2025-10-02 13:59:22.135292263 +0000 UTC m=+10162.685658637" Oct 02 13:59:22 crc kubenswrapper[4929]: I1002 13:59:22.170170 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" path="/var/lib/kubelet/pods/efb5d6a7-39b6-4bda-8da1-623951efdfba/volumes" Oct 02 13:59:23 crc kubenswrapper[4929]: I1002 13:59:23.944257 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:23 crc kubenswrapper[4929]: I1002 13:59:23.944823 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:24 crc kubenswrapper[4929]: I1002 13:59:24.005655 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:33 crc kubenswrapper[4929]: I1002 13:59:33.993309 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jzlqd" Oct 02 13:59:34 crc kubenswrapper[4929]: I1002 13:59:34.048689 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzlqd"] Oct 02 13:59:34 crc kubenswrapper[4929]: I1002 
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.293835 4929 generic.go:334] "Generic (PLEG): container finished" podID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerID="44b14602041360edae48fd7ed4aae1695cd6ddfaafa314c65a05cf5dbda97bd1" exitCode=0
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.293926 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzlqd" event={"ID":"628328e4-e417-420c-a3c7-1d898f03f6ec","Type":"ContainerDied","Data":"44b14602041360edae48fd7ed4aae1695cd6ddfaafa314c65a05cf5dbda97bd1"}
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.484886 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzlqd"
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.534413 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9862\" (UniqueName: \"kubernetes.io/projected/628328e4-e417-420c-a3c7-1d898f03f6ec-kube-api-access-m9862\") pod \"628328e4-e417-420c-a3c7-1d898f03f6ec\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") "
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.534677 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-catalog-content\") pod \"628328e4-e417-420c-a3c7-1d898f03f6ec\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") "
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.534878 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-utilities\") pod \"628328e4-e417-420c-a3c7-1d898f03f6ec\" (UID: \"628328e4-e417-420c-a3c7-1d898f03f6ec\") "
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.537364 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-utilities" (OuterVolumeSpecName: "utilities") pod "628328e4-e417-420c-a3c7-1d898f03f6ec" (UID: "628328e4-e417-420c-a3c7-1d898f03f6ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.591177 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628328e4-e417-420c-a3c7-1d898f03f6ec-kube-api-access-m9862" (OuterVolumeSpecName: "kube-api-access-m9862") pod "628328e4-e417-420c-a3c7-1d898f03f6ec" (UID: "628328e4-e417-420c-a3c7-1d898f03f6ec"). InnerVolumeSpecName "kube-api-access-m9862". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.605689 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "628328e4-e417-420c-a3c7-1d898f03f6ec" (UID: "628328e4-e417-420c-a3c7-1d898f03f6ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.638298 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.638330 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9862\" (UniqueName: \"kubernetes.io/projected/628328e4-e417-420c-a3c7-1d898f03f6ec-kube-api-access-m9862\") on node \"crc\" DevicePath \"\""
Oct 02 13:59:35 crc kubenswrapper[4929]: I1002 13:59:35.638343 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628328e4-e417-420c-a3c7-1d898f03f6ec-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 13:59:36 crc kubenswrapper[4929]: I1002 13:59:36.307229 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzlqd" event={"ID":"628328e4-e417-420c-a3c7-1d898f03f6ec","Type":"ContainerDied","Data":"9011352686f2387488239156ee07ab85734550e43b9a34003094b4572042e4d6"}
Oct 02 13:59:36 crc kubenswrapper[4929]: I1002 13:59:36.307466 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzlqd"
Oct 02 13:59:36 crc kubenswrapper[4929]: I1002 13:59:36.307483 4929 scope.go:117] "RemoveContainer" containerID="44b14602041360edae48fd7ed4aae1695cd6ddfaafa314c65a05cf5dbda97bd1"
Oct 02 13:59:36 crc kubenswrapper[4929]: I1002 13:59:36.334828 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzlqd"]
Oct 02 13:59:36 crc kubenswrapper[4929]: I1002 13:59:36.352325 4929 scope.go:117] "RemoveContainer" containerID="e76badf0107b41509870e1ad2a778f3288a407580dd5efebbf5584856346f7b7"
Oct 02 13:59:36 crc kubenswrapper[4929]: I1002 13:59:36.354470 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzlqd"]
Oct 02 13:59:36 crc kubenswrapper[4929]: I1002 13:59:36.396852 4929 scope.go:117] "RemoveContainer" containerID="60a7db5a19f17470604f36e5ec9cc03313d0c3db89e3e1d4bfebfb4397bfbcb1"
Oct 02 13:59:38 crc kubenswrapper[4929]: I1002 13:59:38.171397 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" path="/var/lib/kubelet/pods/628328e4-e417-420c-a3c7-1d898f03f6ec/volumes"
Oct 02 13:59:42 crc kubenswrapper[4929]: I1002 13:59:42.830515 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_da433ef9-9937-45f5-b5ef-121eed623b2f/init-config-reloader/0.log"
Oct 02 13:59:43 crc kubenswrapper[4929]: I1002 13:59:43.108910 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_da433ef9-9937-45f5-b5ef-121eed623b2f/init-config-reloader/0.log"
Oct 02 13:59:43 crc kubenswrapper[4929]: I1002 13:59:43.187595 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_da433ef9-9937-45f5-b5ef-121eed623b2f/alertmanager/0.log"
Oct 02 13:59:43 crc kubenswrapper[4929]: I1002 13:59:43.347261 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_da433ef9-9937-45f5-b5ef-121eed623b2f/config-reloader/0.log"
Oct 02 13:59:43 crc kubenswrapper[4929]: I1002 13:59:43.552065 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_efd3c5dd-604f-436d-8c94-abec6c600e77/aodh-api/0.log"
Oct 02 13:59:43 crc kubenswrapper[4929]: I1002 13:59:43.698828 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_efd3c5dd-604f-436d-8c94-abec6c600e77/aodh-evaluator/0.log"
Oct 02 13:59:43 crc kubenswrapper[4929]: I1002 13:59:43.802656 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_efd3c5dd-604f-436d-8c94-abec6c600e77/aodh-listener/0.log"
Oct 02 13:59:43 crc kubenswrapper[4929]: I1002 13:59:43.927123 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_efd3c5dd-604f-436d-8c94-abec6c600e77/aodh-notifier/0.log"
Oct 02 13:59:44 crc kubenswrapper[4929]: I1002 13:59:44.206755 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c58fcb746-wqbl8_e600388e-615b-43b0-a87f-0a6e4dc68ce9/barbican-api/0.log"
Oct 02 13:59:44 crc kubenswrapper[4929]: I1002 13:59:44.340639 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c58fcb746-wqbl8_e600388e-615b-43b0-a87f-0a6e4dc68ce9/barbican-api-log/0.log"
Oct 02 13:59:44 crc kubenswrapper[4929]: I1002 13:59:44.743865 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6698dc494b-48tlq_8383e528-282b-4570-9f27-1d9ebdc46908/barbican-keystone-listener/0.log"
Oct 02 13:59:44 crc kubenswrapper[4929]: I1002 13:59:44.942440 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6698dc494b-48tlq_8383e528-282b-4570-9f27-1d9ebdc46908/barbican-keystone-listener-log/0.log"
Oct 02 13:59:45 crc kubenswrapper[4929]: I1002 13:59:45.023281 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74bf4ccf5-t64jz_eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212/barbican-worker/0.log"
Oct 02 13:59:45 crc kubenswrapper[4929]: I1002 13:59:45.181080 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74bf4ccf5-t64jz_eddf7ffe-33c7-4e45-9fd3-c6dfe7e8b212/barbican-worker-log/0.log"
Oct 02 13:59:45 crc kubenswrapper[4929]: I1002 13:59:45.322757 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-6pmtz_e3cdc9c0-5df2-4454-a0d3-4e041bca04e8/bootstrap-openstack-openstack-cell1/0.log"
Oct 02 13:59:45 crc kubenswrapper[4929]: I1002 13:59:45.553707 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2009a546-483b-41a2-97ef-2ceb4c8356e6/ceilometer-central-agent/0.log"
Oct 02 13:59:45 crc kubenswrapper[4929]: I1002 13:59:45.581862 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2009a546-483b-41a2-97ef-2ceb4c8356e6/ceilometer-notification-agent/0.log"
Oct 02 13:59:45 crc kubenswrapper[4929]: I1002 13:59:45.700812 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2009a546-483b-41a2-97ef-2ceb4c8356e6/proxy-httpd/0.log"
Oct 02 13:59:45 crc kubenswrapper[4929]: I1002 13:59:45.776289 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2009a546-483b-41a2-97ef-2ceb4c8356e6/sg-core/0.log"
Oct 02 13:59:46 crc kubenswrapper[4929]: I1002 13:59:46.263693 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-z8dbl_6f793d5a-649b-4ef6-9935-deabf8dcd0c8/ceph-client-openstack-openstack-cell1/0.log"
Oct 02 13:59:46 crc kubenswrapper[4929]: I1002 13:59:46.532849 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_85ff883b-f424-4808-bc97-530931bc3025/cinder-api/0.log"
Oct 02 13:59:46 crc kubenswrapper[4929]: I1002 13:59:46.575299 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_85ff883b-f424-4808-bc97-530931bc3025/cinder-api-log/0.log"
Oct 02 13:59:46 crc kubenswrapper[4929]: I1002 13:59:46.866985 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e3514ec2-d436-4bdb-8d89-ffb37157ac2d/cinder-backup/0.log"
Oct 02 13:59:46 crc kubenswrapper[4929]: I1002 13:59:46.885710 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_e3514ec2-d436-4bdb-8d89-ffb37157ac2d/probe/0.log"
Oct 02 13:59:47 crc kubenswrapper[4929]: I1002 13:59:47.090308 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_692a38b4-b060-4289-90b5-224e45ed83ca/cinder-scheduler/0.log"
Oct 02 13:59:47 crc kubenswrapper[4929]: I1002 13:59:47.111788 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_692a38b4-b060-4289-90b5-224e45ed83ca/probe/0.log"
Oct 02 13:59:47 crc kubenswrapper[4929]: I1002 13:59:47.388491 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_682cda70-91b9-44fc-b6e6-364a397c6de8/cinder-volume/0.log"
Oct 02 13:59:47 crc kubenswrapper[4929]: I1002 13:59:47.481004 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_682cda70-91b9-44fc-b6e6-364a397c6de8/probe/0.log"
Oct 02 13:59:47 crc kubenswrapper[4929]: I1002 13:59:47.900275 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-cz59f_b58c4837-52b8-431e-b20a-1ab2fd041640/configure-network-openstack-openstack-cell1/0.log"
Oct 02 13:59:48 crc kubenswrapper[4929]: I1002 13:59:48.180804 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-dz8bz_5c519b63-6b84-40a0-a2fa-d28e907d8c5d/configure-os-openstack-openstack-cell1/0.log"
Oct 02 13:59:48 crc kubenswrapper[4929]: I1002 13:59:48.329078 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79c74f8487-xpcth_b0f16e97-68b0-4528-a176-ba1c3a8e95cb/init/0.log"
Oct 02 13:59:48 crc kubenswrapper[4929]: I1002 13:59:48.496324 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79c74f8487-xpcth_b0f16e97-68b0-4528-a176-ba1c3a8e95cb/init/0.log"
Oct 02 13:59:48 crc kubenswrapper[4929]: I1002 13:59:48.556999 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79c74f8487-xpcth_b0f16e97-68b0-4528-a176-ba1c3a8e95cb/dnsmasq-dns/0.log"
Oct 02 13:59:48 crc kubenswrapper[4929]: I1002 13:59:48.741583 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-dvjbj_c748b552-6dd6-4df2-934c-651c8c00add9/download-cache-openstack-openstack-cell1/0.log"
Oct 02 13:59:48 crc kubenswrapper[4929]: I1002 13:59:48.820856 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f2d786e6-cfcd-4c6b-98b8-dade07f516d5/glance-httpd/0.log"
Oct 02 13:59:48 crc kubenswrapper[4929]: I1002 13:59:48.930529 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f2d786e6-cfcd-4c6b-98b8-dade07f516d5/glance-log/0.log"
Oct 02 13:59:49 crc kubenswrapper[4929]: I1002 13:59:49.033081 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1e17b33c-1789-4c39-b643-484d5bcb4f72/glance-httpd/0.log"
Oct 02 13:59:49 crc kubenswrapper[4929]: I1002 13:59:49.126054 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1e17b33c-1789-4c39-b643-484d5bcb4f72/glance-log/0.log"
Oct 02 13:59:49 crc kubenswrapper[4929]: I1002 13:59:49.579642 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-d4c9d7757-cmsx6_f215210a-1592-4fc1-9dee-b9b0a3f6ed80/heat-api/0.log"
Oct 02 13:59:49 crc kubenswrapper[4929]: I1002 13:59:49.844646 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-d5d869965-b6fjv_5e0cc909-9c71-492c-9f3a-6bc2acdb8d31/heat-cfnapi/0.log"
Oct 02 13:59:49 crc kubenswrapper[4929]: I1002 13:59:49.877788 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6dd94f695d-82pw4_2b95af5b-0003-4ffb-9c24-796e635a2252/heat-engine/0.log"
Oct 02 13:59:50 crc kubenswrapper[4929]: I1002 13:59:50.212732 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-64cf64777-tqdzr_bfdf4e39-cc15-45f7-a15a-4b99136c1e6d/horizon/0.log"
Oct 02 13:59:50 crc kubenswrapper[4929]: I1002 13:59:50.262903 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-64cf64777-tqdzr_bfdf4e39-cc15-45f7-a15a-4b99136c1e6d/horizon-log/0.log"
Oct 02 13:59:50 crc kubenswrapper[4929]: I1002 13:59:50.436381 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-bwpjd_3c0fbe02-afa8-426c-b7dd-bc9dde60e8a2/install-certs-openstack-openstack-cell1/0.log"
Oct 02 13:59:50 crc kubenswrapper[4929]: I1002 13:59:50.604074 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-2wzfc_c48e38fe-d536-4f2b-9d05-4831d3af9490/install-os-openstack-openstack-cell1/0.log"
Oct 02 13:59:50 crc kubenswrapper[4929]: I1002 13:59:50.892279 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323501-f9cmf_ec8358db-3002-4ea6-8bed-c92d946761c4/keystone-cron/0.log"
Oct 02 13:59:50 crc kubenswrapper[4929]: I1002 13:59:50.962026 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-548d99c794-gk42m_cdb34ab4-e82e-492e-81cb-0b9dabdf54ae/keystone-api/0.log"
Oct 02 13:59:51 crc kubenswrapper[4929]: I1002 13:59:51.086555 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d79f4d99-93e0-46d9-ac92-9ef18cbc992c/kube-state-metrics/0.log"
Oct 02 13:59:51 crc kubenswrapper[4929]: I1002 13:59:51.386680 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-4jxh7_54ed1de3-f724-4239-8d13-a33ca45c5d4b/libvirt-openstack-openstack-cell1/0.log"
Oct 02 13:59:51 crc kubenswrapper[4929]: I1002 13:59:51.638706 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_20702f9c-bac3-4da2-afc6-59ef9849429f/manila-api-log/0.log"
Oct 02 13:59:51 crc kubenswrapper[4929]: I1002 13:59:51.682561 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_20702f9c-bac3-4da2-afc6-59ef9849429f/manila-api/0.log"
Oct 02 13:59:51 crc kubenswrapper[4929]: I1002 13:59:51.946686 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_067603a9-c42e-433c-ad6b-1522338b1041/probe/0.log"
Oct 02 13:59:51 crc kubenswrapper[4929]: I1002 13:59:51.993704 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_067603a9-c42e-433c-ad6b-1522338b1041/manila-scheduler/0.log"
Oct 02 13:59:52 crc kubenswrapper[4929]: I1002 13:59:52.169293 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2215b331-4644-4892-93cc-a99728b27cad/manila-share/0.log"
Oct 02 13:59:52 crc kubenswrapper[4929]: I1002 13:59:52.259104 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_2215b331-4644-4892-93cc-a99728b27cad/probe/0.log"
Oct 02 13:59:52 crc kubenswrapper[4929]: I1002 13:59:52.853302 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5db984bf45-rgjlr_eabfaed5-9966-4922-bad1-f5cf35ab06eb/neutron-httpd/0.log"
Oct 02 13:59:52 crc kubenswrapper[4929]: I1002 13:59:52.961148 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5db984bf45-rgjlr_eabfaed5-9966-4922-bad1-f5cf35ab06eb/neutron-api/0.log"
Oct 02 13:59:53 crc kubenswrapper[4929]: I1002 13:59:53.364079 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-qhj87_fc0badd8-a9ba-44d5-9ceb-600392fd2c2e/neutron-dhcp-openstack-openstack-cell1/0.log"
Oct 02 13:59:53 crc kubenswrapper[4929]: I1002 13:59:53.726531 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-4w5vl_479d3827-fdee-4b7a-b659-6fc9a86f0508/neutron-metadata-openstack-openstack-cell1/0.log"
Oct 02 13:59:54 crc kubenswrapper[4929]: I1002 13:59:54.211119 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-gq6bf_e7bef8fe-3f7d-4798-b217-76996aab4a9f/neutron-sriov-openstack-openstack-cell1/0.log"
Oct 02 13:59:54 crc kubenswrapper[4929]: I1002 13:59:54.520179 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2dd72841-9f27-48f6-9e2c-1de4e8d50e72/nova-api-api/0.log"
Oct 02 13:59:54 crc kubenswrapper[4929]: I1002 13:59:54.648716 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2dd72841-9f27-48f6-9e2c-1de4e8d50e72/nova-api-log/0.log"
Oct 02 13:59:55 crc kubenswrapper[4929]: I1002 13:59:55.518872 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_dc2e75da-54a2-4f25-a6f5-151c7a5bc0ac/nova-cell0-conductor-conductor/0.log"
Oct 02 13:59:55 crc kubenswrapper[4929]: I1002 13:59:55.525870 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ce6475ed-02e6-4207-aa71-b887b2c53b8d/memcached/0.log"
Oct 02 13:59:55 crc kubenswrapper[4929]: I1002 13:59:55.779892 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b2217934-7871-4af0-a947-bcc4c0e7c63d/nova-cell1-conductor-conductor/0.log"
Oct 02 13:59:55 crc kubenswrapper[4929]: I1002 13:59:55.967050 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_68107bd7-f800-4101-963b-24050976ae71/nova-cell1-novncproxy-novncproxy/0.log"
Oct 02 13:59:56 crc kubenswrapper[4929]: I1002 13:59:56.281237 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellz6b98_2e41a2c6-d1f1-45c5-92a6-f7cf301ad241/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log"
Oct 02 13:59:56 crc kubenswrapper[4929]: I1002 13:59:56.392443 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-8z5c4_0d83ca4f-796e-47b1-a980-7d8f7d7cfb5a/nova-cell1-openstack-openstack-cell1/0.log"
Oct 02 13:59:57 crc kubenswrapper[4929]: I1002 13:59:57.186557 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74203203-c2a9-4137-be9b-cc65ad40f5bf/nova-metadata-log/0.log"
Oct 02 13:59:57 crc kubenswrapper[4929]: I1002 13:59:57.187420 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_74203203-c2a9-4137-be9b-cc65ad40f5bf/nova-metadata-metadata/0.log"
Oct 02 13:59:57 crc kubenswrapper[4929]: I1002 13:59:57.475023 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_900ac3ae-014e-4278-b930-71cdabe0da9d/nova-scheduler-scheduler/0.log"
Oct 02 13:59:57 crc kubenswrapper[4929]: I1002 13:59:57.599217 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5d7d68fcb6-96k42_0d602bfd-b57a-43b4-aaee-d2766cee3ec4/init/0.log"
Oct 02 13:59:57 crc kubenswrapper[4929]: I1002 13:59:57.868874 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5d7d68fcb6-96k42_0d602bfd-b57a-43b4-aaee-d2766cee3ec4/init/0.log"
Oct 02 13:59:57 crc kubenswrapper[4929]: I1002 13:59:57.907732 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5d7d68fcb6-96k42_0d602bfd-b57a-43b4-aaee-d2766cee3ec4/octavia-api-provider-agent/0.log"
Oct 02 13:59:57 crc kubenswrapper[4929]: I1002 13:59:57.971534 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-5d7d68fcb6-96k42_0d602bfd-b57a-43b4-aaee-d2766cee3ec4/octavia-api/0.log"
Oct 02 13:59:58 crc kubenswrapper[4929]: I1002 13:59:58.121678 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-7xfvd_ab541ae0-0e64-4fc9-9f78-cb330e890126/init/0.log"
Oct 02 13:59:58 crc kubenswrapper[4929]: I1002 13:59:58.281864 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-7xfvd_ab541ae0-0e64-4fc9-9f78-cb330e890126/init/0.log"
Oct 02 13:59:58 crc kubenswrapper[4929]: I1002 13:59:58.351500 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-7xfvd_ab541ae0-0e64-4fc9-9f78-cb330e890126/octavia-healthmanager/0.log"
Oct 02 13:59:58 crc kubenswrapper[4929]: I1002 13:59:58.709368 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qtk4t_cfdcc6b1-932c-4139-bc67-155977759446/init/0.log"
Oct 02 13:59:58 crc kubenswrapper[4929]: I1002 13:59:58.830061 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qtk4t_cfdcc6b1-932c-4139-bc67-155977759446/init/0.log"
Oct 02 13:59:58 crc kubenswrapper[4929]: I1002 13:59:58.885930 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-qtk4t_cfdcc6b1-932c-4139-bc67-155977759446/octavia-housekeeping/0.log"
Oct 02 13:59:59 crc kubenswrapper[4929]: I1002 13:59:59.035269 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-7246t_87fc42ed-8ef3-42be-8dc9-0d8e62925951/init/0.log"
Oct 02 13:59:59 crc kubenswrapper[4929]: I1002 13:59:59.233394 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-7246t_87fc42ed-8ef3-42be-8dc9-0d8e62925951/init/0.log"
Oct 02 13:59:59 crc kubenswrapper[4929]: I1002 13:59:59.279793 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-7246t_87fc42ed-8ef3-42be-8dc9-0d8e62925951/octavia-amphora-httpd/0.log"
Oct 02 13:59:59 crc kubenswrapper[4929]: I1002 13:59:59.440023 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-2nxzx_5cb9a69c-530c-408f-80fe-0027e366683f/init/0.log"
Oct 02 13:59:59 crc kubenswrapper[4929]: I1002 13:59:59.614393 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-2nxzx_5cb9a69c-530c-408f-80fe-0027e366683f/init/0.log"
Oct 02 13:59:59 crc kubenswrapper[4929]: I1002 13:59:59.676568 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-2nxzx_5cb9a69c-530c-408f-80fe-0027e366683f/octavia-rsyslog/0.log"
Oct 02 13:59:59 crc kubenswrapper[4929]: I1002 13:59:59.829895 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8bpgx_3580a8ac-ecc4-46d6-8e76-4c49e5341380/init/0.log"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.102574 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8bpgx_3580a8ac-ecc4-46d6-8e76-4c49e5341380/init/0.log"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.134256 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8bpgx_3580a8ac-ecc4-46d6-8e76-4c49e5341380/octavia-worker/0.log"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.182926 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"]
Oct 02 14:00:00 crc kubenswrapper[4929]: E1002 14:00:00.183298 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerName="extract-utilities"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183314 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerName="extract-utilities"
Oct 02 14:00:00 crc kubenswrapper[4929]: E1002 14:00:00.183342 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="extract-utilities"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183349 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="extract-utilities"
Oct 02 14:00:00 crc kubenswrapper[4929]: E1002 14:00:00.183395 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="registry-server"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183401 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="registry-server"
Oct 02 14:00:00 crc kubenswrapper[4929]: E1002 14:00:00.183412 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="extract-content"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183419 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="extract-content"
Oct 02 14:00:00 crc kubenswrapper[4929]: E1002 14:00:00.183433 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerName="extract-content"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183441 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerName="extract-content"
Oct 02 14:00:00 crc kubenswrapper[4929]: E1002 14:00:00.183473 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerName="registry-server"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183482 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerName="registry-server"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183699 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb5d6a7-39b6-4bda-8da1-623951efdfba" containerName="registry-server"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.183715 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="628328e4-e417-420c-a3c7-1d898f03f6ec" containerName="registry-server"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.184535 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.190221 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.190413 4929 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.192850 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"]
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.298594 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a44084bc-2eee-4a68-9d71-018fb553e71b-config-volume\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.298662 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4j2r\" (UniqueName: \"kubernetes.io/projected/a44084bc-2eee-4a68-9d71-018fb553e71b-kube-api-access-d4j2r\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.298761 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a44084bc-2eee-4a68-9d71-018fb553e71b-secret-volume\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.312025 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21883b8f-1b4a-4eb8-9f9f-48047745f86f/mysql-bootstrap/0.log"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.400814 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a44084bc-2eee-4a68-9d71-018fb553e71b-secret-volume\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.400971 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a44084bc-2eee-4a68-9d71-018fb553e71b-config-volume\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.401020 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4j2r\" (UniqueName: \"kubernetes.io/projected/a44084bc-2eee-4a68-9d71-018fb553e71b-kube-api-access-d4j2r\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.401890 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a44084bc-2eee-4a68-9d71-018fb553e71b-config-volume\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.416135 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4j2r\" (UniqueName: \"kubernetes.io/projected/a44084bc-2eee-4a68-9d71-018fb553e71b-kube-api-access-d4j2r\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.416835 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a44084bc-2eee-4a68-9d71-018fb553e71b-secret-volume\") pod \"collect-profiles-29323560-2945d\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.522568 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.532841 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21883b8f-1b4a-4eb8-9f9f-48047745f86f/galera/0.log"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.593435 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_21883b8f-1b4a-4eb8-9f9f-48047745f86f/mysql-bootstrap/0.log"
Oct 02 14:00:00 crc kubenswrapper[4929]: I1002 14:00:00.802807 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f4a722cb-85e5-4bcd-8acd-101708d08d0e/mysql-bootstrap/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.015248 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f4a722cb-85e5-4bcd-8acd-101708d08d0e/mysql-bootstrap/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.043583 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f4a722cb-85e5-4bcd-8acd-101708d08d0e/galera/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.181649 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"]
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.299579 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6c22dd30-f774-4d88-8e74-70a1dac9c474/openstackclient/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.393896 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-w7jz9_cca7d8a2-15be-44ac-9bb5-4aad5973a40d/openstack-network-exporter/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.575746 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d" event={"ID":"a44084bc-2eee-4a68-9d71-018fb553e71b","Type":"ContainerStarted","Data":"c2433a3a6575ac4d0ba7b7fdef8edf4045a427a24fa24c13341b8123bf883b6d"}
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.576096 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d" event={"ID":"a44084bc-2eee-4a68-9d71-018fb553e71b","Type":"ContainerStarted","Data":"1b9a4ee12d1497f4ccef21cfbfe2172576b3d3b6f25d13e50ca6105c8dfe7ce8"}
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.598777 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d" podStartSLOduration=1.598757291 podStartE2EDuration="1.598757291s" podCreationTimestamp="2025-10-02 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 14:00:01.594427776 +0000 UTC m=+10202.144794150" watchObservedRunningTime="2025-10-02 14:00:01.598757291 +0000 UTC m=+10202.149123655"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.622784 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hpmws_e15929c9-ed67-44ce-8158-b85635f32121/ovsdb-server-init/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.839537 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hpmws_e15929c9-ed67-44ce-8158-b85635f32121/ovsdb-server-init/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.855574 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hpmws_e15929c9-ed67-44ce-8158-b85635f32121/ovsdb-server/0.log"
Oct 02 14:00:01 crc kubenswrapper[4929]: I1002 14:00:01.896861 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hpmws_e15929c9-ed67-44ce-8158-b85635f32121/ovs-vswitchd/0.log"
Oct 02 14:00:02 crc kubenswrapper[4929]: I1002 14:00:02.193727 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sc5zm_1ee7c592-3942-47da-9be7-e146a4768544/ovn-controller/0.log"
Oct 02 14:00:02 crc kubenswrapper[4929]: I1002 14:00:02.395643 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4f6a797f-1ea1-430f-8898-c10fa43d597f/ovn-northd/0.log"
Oct 02 14:00:02 crc kubenswrapper[4929]: I1002 14:00:02.417402 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4f6a797f-1ea1-430f-8898-c10fa43d597f/openstack-network-exporter/0.log"
Oct 02 14:00:02 crc kubenswrapper[4929]: I1002 14:00:02.588354 4929 generic.go:334] "Generic (PLEG): container finished" podID="a44084bc-2eee-4a68-9d71-018fb553e71b" containerID="c2433a3a6575ac4d0ba7b7fdef8edf4045a427a24fa24c13341b8123bf883b6d" exitCode=0
Oct 02 14:00:02 crc kubenswrapper[4929]: I1002 14:00:02.588400 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d" event={"ID":"a44084bc-2eee-4a68-9d71-018fb553e71b","Type":"ContainerDied","Data":"c2433a3a6575ac4d0ba7b7fdef8edf4045a427a24fa24c13341b8123bf883b6d"}
Oct 02 14:00:02 crc kubenswrapper[4929]: I1002 14:00:02.786831 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-gvqsj_d61ace99-a0d0-431c-989a-586bdff5c0de/ovn-openstack-openstack-cell1/0.log"
Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.121732 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8859d23-8436-449b-874a-722a6bc44f5c/ovsdbserver-nb/0.log"
Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.137378 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8859d23-8436-449b-874a-722a6bc44f5c/openstack-network-exporter/0.log"
Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.331985 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f/openstack-network-exporter/0.log"
Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.367871 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_4ab1b12f-93dd-4a7b-9f77-ebbd7b53f32f/ovsdbserver-nb/0.log"
Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.534049 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_2f945dd1-560f-4013-b22c-bb99be50b29d/openstack-network-exporter/0.log"
Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.585863 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_2f945dd1-560f-4013-b22c-bb99be50b29d/ovsdbserver-nb/0.log"
Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.835872 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bbb37575-cc7e-4f62-ab42-32c31388ed7d/openstack-network-exporter/0.log"
path="/var/log/pods/openstack_ovsdbserver-sb-0_bbb37575-cc7e-4f62-ab42-32c31388ed7d/openstack-network-exporter/0.log" Oct 02 14:00:03 crc kubenswrapper[4929]: I1002 14:00:03.839667 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bbb37575-cc7e-4f62-ab42-32c31388ed7d/ovsdbserver-sb/0.log" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.609543 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d" event={"ID":"a44084bc-2eee-4a68-9d71-018fb553e71b","Type":"ContainerDied","Data":"1b9a4ee12d1497f4ccef21cfbfe2172576b3d3b6f25d13e50ca6105c8dfe7ce8"} Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.609814 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b9a4ee12d1497f4ccef21cfbfe2172576b3d3b6f25d13e50ca6105c8dfe7ce8" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.609553 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.700919 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4j2r\" (UniqueName: \"kubernetes.io/projected/a44084bc-2eee-4a68-9d71-018fb553e71b-kube-api-access-d4j2r\") pod \"a44084bc-2eee-4a68-9d71-018fb553e71b\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.701044 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a44084bc-2eee-4a68-9d71-018fb553e71b-config-volume\") pod \"a44084bc-2eee-4a68-9d71-018fb553e71b\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.701308 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a44084bc-2eee-4a68-9d71-018fb553e71b-secret-volume\") pod \"a44084bc-2eee-4a68-9d71-018fb553e71b\" (UID: \"a44084bc-2eee-4a68-9d71-018fb553e71b\") " Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.704573 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a44084bc-2eee-4a68-9d71-018fb553e71b-config-volume" (OuterVolumeSpecName: "config-volume") pod "a44084bc-2eee-4a68-9d71-018fb553e71b" (UID: "a44084bc-2eee-4a68-9d71-018fb553e71b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.717466 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44084bc-2eee-4a68-9d71-018fb553e71b-kube-api-access-d4j2r" (OuterVolumeSpecName: "kube-api-access-d4j2r") pod "a44084bc-2eee-4a68-9d71-018fb553e71b" (UID: "a44084bc-2eee-4a68-9d71-018fb553e71b"). InnerVolumeSpecName "kube-api-access-d4j2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.727181 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44084bc-2eee-4a68-9d71-018fb553e71b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a44084bc-2eee-4a68-9d71-018fb553e71b" (UID: "a44084bc-2eee-4a68-9d71-018fb553e71b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.751273 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_c37826e0-0523-4e9c-a506-afc38262b985/openstack-network-exporter/0.log" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.795372 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_c37826e0-0523-4e9c-a506-afc38262b985/ovsdbserver-sb/0.log" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.803910 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4j2r\" (UniqueName: \"kubernetes.io/projected/a44084bc-2eee-4a68-9d71-018fb553e71b-kube-api-access-d4j2r\") on node \"crc\" DevicePath \"\"" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.803973 4929 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a44084bc-2eee-4a68-9d71-018fb553e71b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.803986 4929 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a44084bc-2eee-4a68-9d71-018fb553e71b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 14:00:04 crc kubenswrapper[4929]: I1002 14:00:04.987339 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_88692526-80f8-4a95-93d6-a6920288ddbf/openstack-network-exporter/0.log" Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.043196 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_88692526-80f8-4a95-93d6-a6920288ddbf/ovsdbserver-sb/0.log" Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.301450 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f684d7d76-vbzgj_f87c45ef-7b90-4dde-ada1-820233761ec1/placement-api/0.log" Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.340136 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6f684d7d76-vbzgj_f87c45ef-7b90-4dde-ada1-820233761ec1/placement-log/0.log" Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.495028 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c2scbj_f83da3fa-a3f5-4a4e-a3b0-a8908d72a027/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.663465 4929 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323560-2945d"
Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.701847 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69"]
Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.712816 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-jgr69"]
Oct 02 14:00:05 crc kubenswrapper[4929]: I1002 14:00:05.735498 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8fceab33-2fbd-4885-8d95-87f1a28c9c65/init-config-reloader/0.log"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.171253 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cb8ab5-a3d3-45d7-a1e1-4809e4986e81" path="/var/lib/kubelet/pods/87cb8ab5-a3d3-45d7-a1e1-4809e4986e81/volumes"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.480122 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8fceab33-2fbd-4885-8d95-87f1a28c9c65/init-config-reloader/0.log"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.489369 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8fceab33-2fbd-4885-8d95-87f1a28c9c65/config-reloader/0.log"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.507950 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8fceab33-2fbd-4885-8d95-87f1a28c9c65/prometheus/0.log"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.696507 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a43f80b5-2330-4ade-9fe5-80bcb95a36b5/setup-container/0.log"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.725209 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8fceab33-2fbd-4885-8d95-87f1a28c9c65/thanos-sidecar/0.log"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.911941 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a43f80b5-2330-4ade-9fe5-80bcb95a36b5/setup-container/0.log"
Oct 02 14:00:06 crc kubenswrapper[4929]: I1002 14:00:06.949233 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a43f80b5-2330-4ade-9fe5-80bcb95a36b5/rabbitmq/0.log"
Oct 02 14:00:07 crc kubenswrapper[4929]: I1002 14:00:07.065334 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3/setup-container/0.log"
Oct 02 14:00:07 crc kubenswrapper[4929]: I1002 14:00:07.487291 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3/setup-container/0.log"
Oct 02 14:00:07 crc kubenswrapper[4929]: I1002 14:00:07.721798 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-fgfkl_3f44e6b7-fafd-4b4c-9fa5-f35252b9e1e3/reboot-os-openstack-openstack-cell1/0.log"
Oct 02 14:00:07 crc kubenswrapper[4929]: I1002 14:00:07.963327 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-xmnfg_3491a7e6-e824-4a76-b1ee-58e43e17cbe5/run-os-openstack-openstack-cell1/0.log"
Oct 02 14:00:08 crc kubenswrapper[4929]: I1002 14:00:08.214132 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-vzp5g_a4d3a2ce-baba-4cc5-84ec-50b29f72fc31/ssh-known-hosts-openstack/0.log"
Oct 02 14:00:08 crc kubenswrapper[4929]: I1002 14:00:08.470317 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-m9k6m_d19e27bb-e8ff-410d-a0cb-be48e389a20c/telemetry-openstack-openstack-cell1/0.log"
Oct 02 14:00:08 crc kubenswrapper[4929]: I1002 14:00:08.554118 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f5ea066c-e1c6-43a4-9e4e-2783a95b8ba3/rabbitmq/0.log"
Oct 02 14:00:08 crc kubenswrapper[4929]: I1002 14:00:08.705340 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-kv7qw_4bff17f1-9395-47c4-a706-a0adf7745c50/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log"
Oct 02 14:00:08 crc kubenswrapper[4929]: I1002 14:00:08.876716 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-ks8s7_724f6796-c8d6-4bf3-9623-47da1ed4754f/validate-network-openstack-openstack-cell1/0.log"
Oct 02 14:00:29 crc kubenswrapper[4929]: I1002 14:00:29.896224 4929 scope.go:117] "RemoveContainer" containerID="1d931c3e944cafdffec2ef877bf193895a4390646d34156932bfd84099bf6fc6"
Oct 02 14:00:44 crc kubenswrapper[4929]: I1002 14:00:44.736749 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 14:00:44 crc kubenswrapper[4929]: I1002 14:00:44.737447 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.183349 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323561-7qwrz"]
Oct 02 14:01:00 crc kubenswrapper[4929]: E1002 14:01:00.184494 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44084bc-2eee-4a68-9d71-018fb553e71b" containerName="collect-profiles"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.184516 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44084bc-2eee-4a68-9d71-018fb553e71b" containerName="collect-profiles"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.185069 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44084bc-2eee-4a68-9d71-018fb553e71b" containerName="collect-profiles"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.215846 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.228333 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323561-7qwrz"]
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.341222 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7669n\" (UniqueName: \"kubernetes.io/projected/7b682abe-201a-48df-b561-147fd6f79b25-kube-api-access-7669n\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.341479 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-config-data\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.341737 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-combined-ca-bundle\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.341847 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-fernet-keys\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.445032 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-combined-ca-bundle\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.445175 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-fernet-keys\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.445252 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7669n\" (UniqueName: \"kubernetes.io/projected/7b682abe-201a-48df-b561-147fd6f79b25-kube-api-access-7669n\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.445307 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-config-data\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.460201 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-config-data\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.463561 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-combined-ca-bundle\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.465872 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7669n\" (UniqueName: \"kubernetes.io/projected/7b682abe-201a-48df-b561-147fd6f79b25-kube-api-access-7669n\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.473192 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-fernet-keys\") pod \"keystone-cron-29323561-7qwrz\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") " pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:00 crc kubenswrapper[4929]: I1002 14:01:00.552492 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:01 crc kubenswrapper[4929]: I1002 14:01:01.035553 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323561-7qwrz"]
Oct 02 14:01:01 crc kubenswrapper[4929]: I1002 14:01:01.297792 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323561-7qwrz" event={"ID":"7b682abe-201a-48df-b561-147fd6f79b25","Type":"ContainerStarted","Data":"f1b35ac404302c18cf448c8d2fb4d22b1100d4596d5928970fa5495c52c53016"}
Oct 02 14:01:01 crc kubenswrapper[4929]: I1002 14:01:01.297876 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323561-7qwrz" event={"ID":"7b682abe-201a-48df-b561-147fd6f79b25","Type":"ContainerStarted","Data":"b692bdfc47ad7f90ccbc95f81ab3aa670899da9e620a8f169f5b596a92860a3b"}
Oct 02 14:01:01 crc kubenswrapper[4929]: I1002 14:01:01.321650 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323561-7qwrz" podStartSLOduration=1.321632502 podStartE2EDuration="1.321632502s" podCreationTimestamp="2025-10-02 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 14:01:01.316238176 +0000 UTC m=+10261.866604560" watchObservedRunningTime="2025-10-02 14:01:01.321632502 +0000 UTC m=+10261.871998866"
Oct 02 14:01:04 crc kubenswrapper[4929]: I1002 14:01:04.330746 4929 generic.go:334] "Generic (PLEG): container finished" podID="7b682abe-201a-48df-b561-147fd6f79b25" containerID="f1b35ac404302c18cf448c8d2fb4d22b1100d4596d5928970fa5495c52c53016" exitCode=0
Oct 02 14:01:04 crc kubenswrapper[4929]: I1002 14:01:04.330852 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323561-7qwrz" event={"ID":"7b682abe-201a-48df-b561-147fd6f79b25","Type":"ContainerDied","Data":"f1b35ac404302c18cf448c8d2fb4d22b1100d4596d5928970fa5495c52c53016"}
Oct 02 14:01:05 crc kubenswrapper[4929]: I1002 14:01:05.819590 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:05 crc kubenswrapper[4929]: I1002 14:01:05.973852 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-fernet-keys\") pod \"7b682abe-201a-48df-b561-147fd6f79b25\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") "
Oct 02 14:01:05 crc kubenswrapper[4929]: I1002 14:01:05.975311 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-config-data\") pod \"7b682abe-201a-48df-b561-147fd6f79b25\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") "
Oct 02 14:01:05 crc kubenswrapper[4929]: I1002 14:01:05.975484 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7669n\" (UniqueName: \"kubernetes.io/projected/7b682abe-201a-48df-b561-147fd6f79b25-kube-api-access-7669n\") pod \"7b682abe-201a-48df-b561-147fd6f79b25\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") "
Oct 02 14:01:05 crc kubenswrapper[4929]: I1002 14:01:05.975553 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-combined-ca-bundle\") pod \"7b682abe-201a-48df-b561-147fd6f79b25\" (UID: \"7b682abe-201a-48df-b561-147fd6f79b25\") "
Oct 02 14:01:05 crc kubenswrapper[4929]: I1002 14:01:05.986198 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7b682abe-201a-48df-b561-147fd6f79b25" (UID: "7b682abe-201a-48df-b561-147fd6f79b25"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 14:01:05 crc kubenswrapper[4929]: I1002 14:01:05.987330 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b682abe-201a-48df-b561-147fd6f79b25-kube-api-access-7669n" (OuterVolumeSpecName: "kube-api-access-7669n") pod "7b682abe-201a-48df-b561-147fd6f79b25" (UID: "7b682abe-201a-48df-b561-147fd6f79b25"). InnerVolumeSpecName "kube-api-access-7669n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.004228 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b682abe-201a-48df-b561-147fd6f79b25" (UID: "7b682abe-201a-48df-b561-147fd6f79b25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.030276 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-config-data" (OuterVolumeSpecName: "config-data") pod "7b682abe-201a-48df-b561-147fd6f79b25" (UID: "7b682abe-201a-48df-b561-147fd6f79b25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.080472 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7669n\" (UniqueName: \"kubernetes.io/projected/7b682abe-201a-48df-b561-147fd6f79b25-kube-api-access-7669n\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.080503 4929 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.080518 4929 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.080530 4929 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682abe-201a-48df-b561-147fd6f79b25-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.350411 4929 generic.go:334] "Generic (PLEG): container finished" podID="d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818" containerID="07e052b2ad4383bb7d87df0d651785c233201813e0af2568a5a62a1d468fcd76" exitCode=0
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.350507 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-4hj44" event={"ID":"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818","Type":"ContainerDied","Data":"07e052b2ad4383bb7d87df0d651785c233201813e0af2568a5a62a1d468fcd76"}
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.354545 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323561-7qwrz" event={"ID":"7b682abe-201a-48df-b561-147fd6f79b25","Type":"ContainerDied","Data":"b692bdfc47ad7f90ccbc95f81ab3aa670899da9e620a8f169f5b596a92860a3b"}
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.354594 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323561-7qwrz"
Oct 02 14:01:06 crc kubenswrapper[4929]: I1002 14:01:06.354621 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b692bdfc47ad7f90ccbc95f81ab3aa670899da9e620a8f169f5b596a92860a3b"
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.486561 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-4hj44"
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.524468 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-4hj44"]
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.533102 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-4hj44"]
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.615435 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdjvs\" (UniqueName: \"kubernetes.io/projected/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-kube-api-access-gdjvs\") pod \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") "
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.615536 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-host\") pod \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\" (UID: \"d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818\") "
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.615747 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-host" (OuterVolumeSpecName: "host") pod "d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818" (UID: "d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.616251 4929 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-host\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.625381 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-kube-api-access-gdjvs" (OuterVolumeSpecName: "kube-api-access-gdjvs") pod "d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818" (UID: "d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818"). InnerVolumeSpecName "kube-api-access-gdjvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 14:01:07 crc kubenswrapper[4929]: I1002 14:01:07.717928 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdjvs\" (UniqueName: \"kubernetes.io/projected/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818-kube-api-access-gdjvs\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.168169 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818" path="/var/lib/kubelet/pods/d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818/volumes"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.378498 4929 scope.go:117] "RemoveContainer" containerID="07e052b2ad4383bb7d87df0d651785c233201813e0af2568a5a62a1d468fcd76"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.378579 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-4hj44"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.707107 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-gzldb"]
Oct 02 14:01:08 crc kubenswrapper[4929]: E1002 14:01:08.707532 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b682abe-201a-48df-b561-147fd6f79b25" containerName="keystone-cron"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.707543 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b682abe-201a-48df-b561-147fd6f79b25" containerName="keystone-cron"
Oct 02 14:01:08 crc kubenswrapper[4929]: E1002 14:01:08.707574 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818" containerName="container-00"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.707581 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818" containerName="container-00"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.707776 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b682abe-201a-48df-b561-147fd6f79b25" containerName="keystone-cron"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.707802 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a5e8da-6cfb-4da4-9e43-ef5aac9e9818" containerName="container-00"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.708506 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.710353 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t6j7s"/"default-dockercfg-s4pvb"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.844158 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-host\") pod \"crc-debug-gzldb\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") " pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.844278 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtzcn\" (UniqueName: \"kubernetes.io/projected/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-kube-api-access-wtzcn\") pod \"crc-debug-gzldb\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") " pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.947392 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-host\") pod \"crc-debug-gzldb\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") " pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.947465 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtzcn\" (UniqueName: \"kubernetes.io/projected/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-kube-api-access-wtzcn\") pod \"crc-debug-gzldb\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") " pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.947548 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-host\") pod \"crc-debug-gzldb\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") " pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:08 crc kubenswrapper[4929]: I1002 14:01:08.968991 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtzcn\" (UniqueName: \"kubernetes.io/projected/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-kube-api-access-wtzcn\") pod \"crc-debug-gzldb\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") " pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:09 crc kubenswrapper[4929]: I1002 14:01:09.026699 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:09 crc kubenswrapper[4929]: I1002 14:01:09.390771 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-gzldb" event={"ID":"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4","Type":"ContainerStarted","Data":"671acdb51100e6246af929d3fd7cb31df8499249a0989a0a4ba28bf8e60488bb"}
Oct 02 14:01:09 crc kubenswrapper[4929]: I1002 14:01:09.391034 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-gzldb" event={"ID":"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4","Type":"ContainerStarted","Data":"0d167280e22f6f63b0aa5d349e45c30fb2ac1d764efd58fe3bbcbfc599798aac"}
Oct 02 14:01:09 crc kubenswrapper[4929]: I1002 14:01:09.405455 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6j7s/crc-debug-gzldb" podStartSLOduration=1.40543273 podStartE2EDuration="1.40543273s" podCreationTimestamp="2025-10-02 14:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 14:01:09.401795805 +0000 UTC m=+10269.952162169" watchObservedRunningTime="2025-10-02 14:01:09.40543273 +0000 UTC m=+10269.955799094"
Oct 02 14:01:10 crc kubenswrapper[4929]: I1002 14:01:10.400441 4929 generic.go:334] "Generic (PLEG): container finished" podID="4ed27a73-efcf-4bf8-93cb-f809a4ed15f4" containerID="671acdb51100e6246af929d3fd7cb31df8499249a0989a0a4ba28bf8e60488bb" exitCode=0
Oct 02 14:01:10 crc kubenswrapper[4929]: I1002 14:01:10.400524 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-gzldb" event={"ID":"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4","Type":"ContainerDied","Data":"671acdb51100e6246af929d3fd7cb31df8499249a0989a0a4ba28bf8e60488bb"}
Oct 02 14:01:11 crc kubenswrapper[4929]: I1002 14:01:11.520326 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:11 crc kubenswrapper[4929]: I1002 14:01:11.698322 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-host\") pod \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") "
Oct 02 14:01:11 crc kubenswrapper[4929]: I1002 14:01:11.698697 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtzcn\" (UniqueName: \"kubernetes.io/projected/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-kube-api-access-wtzcn\") pod \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\" (UID: \"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4\") "
Oct 02 14:01:11 crc kubenswrapper[4929]: I1002 14:01:11.698619 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-host" (OuterVolumeSpecName: "host") pod "4ed27a73-efcf-4bf8-93cb-f809a4ed15f4" (UID: "4ed27a73-efcf-4bf8-93cb-f809a4ed15f4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 14:01:11 crc kubenswrapper[4929]: I1002 14:01:11.699783 4929 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-host\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:11 crc kubenswrapper[4929]: I1002 14:01:11.728767 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-kube-api-access-wtzcn" (OuterVolumeSpecName: "kube-api-access-wtzcn") pod "4ed27a73-efcf-4bf8-93cb-f809a4ed15f4" (UID: "4ed27a73-efcf-4bf8-93cb-f809a4ed15f4"). InnerVolumeSpecName "kube-api-access-wtzcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 14:01:11 crc kubenswrapper[4929]: I1002 14:01:11.803671 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtzcn\" (UniqueName: \"kubernetes.io/projected/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4-kube-api-access-wtzcn\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:12 crc kubenswrapper[4929]: I1002 14:01:12.450859 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-gzldb" event={"ID":"4ed27a73-efcf-4bf8-93cb-f809a4ed15f4","Type":"ContainerDied","Data":"0d167280e22f6f63b0aa5d349e45c30fb2ac1d764efd58fe3bbcbfc599798aac"}
Oct 02 14:01:12 crc kubenswrapper[4929]: I1002 14:01:12.451180 4929 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d167280e22f6f63b0aa5d349e45c30fb2ac1d764efd58fe3bbcbfc599798aac"
Oct 02 14:01:12 crc kubenswrapper[4929]: I1002 14:01:12.451314 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-gzldb"
Oct 02 14:01:14 crc kubenswrapper[4929]: I1002 14:01:14.736346 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 14:01:14 crc kubenswrapper[4929]: I1002 14:01:14.736646 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 14:01:20 crc kubenswrapper[4929]: I1002 14:01:20.127732 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-gzldb"]
Oct 02 14:01:20 crc kubenswrapper[4929]: I1002 14:01:20.139951 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-gzldb"]
Oct 02 14:01:20 crc kubenswrapper[4929]: I1002 14:01:20.169439 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed27a73-efcf-4bf8-93cb-f809a4ed15f4" path="/var/lib/kubelet/pods/4ed27a73-efcf-4bf8-93cb-f809a4ed15f4/volumes"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.294442 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-6ms8r"]
Oct 02 14:01:21 crc kubenswrapper[4929]: E1002 14:01:21.295292 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed27a73-efcf-4bf8-93cb-f809a4ed15f4" containerName="container-00"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.295307 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed27a73-efcf-4bf8-93cb-f809a4ed15f4" containerName="container-00"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.295567 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed27a73-efcf-4bf8-93cb-f809a4ed15f4" containerName="container-00"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.296443 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.299306 4929 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t6j7s"/"default-dockercfg-s4pvb"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.394097 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztpzw\" (UniqueName: \"kubernetes.io/projected/30742474-b02a-43d0-9356-6b41f1af7981-kube-api-access-ztpzw\") pod \"crc-debug-6ms8r\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") " pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.394158 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30742474-b02a-43d0-9356-6b41f1af7981-host\") pod \"crc-debug-6ms8r\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") " pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.496231 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztpzw\" (UniqueName: \"kubernetes.io/projected/30742474-b02a-43d0-9356-6b41f1af7981-kube-api-access-ztpzw\") pod \"crc-debug-6ms8r\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") " pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.496288 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30742474-b02a-43d0-9356-6b41f1af7981-host\") pod \"crc-debug-6ms8r\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") " pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.496534 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30742474-b02a-43d0-9356-6b41f1af7981-host\") pod \"crc-debug-6ms8r\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") " pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.524220 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztpzw\" (UniqueName: \"kubernetes.io/projected/30742474-b02a-43d0-9356-6b41f1af7981-kube-api-access-ztpzw\") pod \"crc-debug-6ms8r\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") " pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: I1002 14:01:21.620681 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:21 crc kubenswrapper[4929]: W1002 14:01:21.657510 4929 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30742474_b02a_43d0_9356_6b41f1af7981.slice/crio-135bed993ce7ad303a72f90c4c70efa6ef78716d4651f72c674c1ff28df9a458 WatchSource:0}: Error finding container 135bed993ce7ad303a72f90c4c70efa6ef78716d4651f72c674c1ff28df9a458: Status 404 returned error can't find the container with id 135bed993ce7ad303a72f90c4c70efa6ef78716d4651f72c674c1ff28df9a458
Oct 02 14:01:22 crc kubenswrapper[4929]: I1002 14:01:22.580891 4929 generic.go:334] "Generic (PLEG): container finished" podID="30742474-b02a-43d0-9356-6b41f1af7981" containerID="f34da049658fc2ce845ce435109cd294af3dc73622eef67f0e551fe95345f6d2" exitCode=0
Oct 02 14:01:22 crc kubenswrapper[4929]: I1002 14:01:22.581196 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-6ms8r" event={"ID":"30742474-b02a-43d0-9356-6b41f1af7981","Type":"ContainerDied","Data":"f34da049658fc2ce845ce435109cd294af3dc73622eef67f0e551fe95345f6d2"}
Oct 02 14:01:22 crc kubenswrapper[4929]: I1002 14:01:22.581230 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/crc-debug-6ms8r" event={"ID":"30742474-b02a-43d0-9356-6b41f1af7981","Type":"ContainerStarted","Data":"135bed993ce7ad303a72f90c4c70efa6ef78716d4651f72c674c1ff28df9a458"}
Oct 02 14:01:22 crc kubenswrapper[4929]: I1002 14:01:22.629290 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-6ms8r"]
Oct 02 14:01:22 crc kubenswrapper[4929]: I1002 14:01:22.640062 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6j7s/crc-debug-6ms8r"]
Oct 02 14:01:23 crc kubenswrapper[4929]: I1002 14:01:23.698753 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:23 crc kubenswrapper[4929]: I1002 14:01:23.846426 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30742474-b02a-43d0-9356-6b41f1af7981-host\") pod \"30742474-b02a-43d0-9356-6b41f1af7981\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") "
Oct 02 14:01:23 crc kubenswrapper[4929]: I1002 14:01:23.846493 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30742474-b02a-43d0-9356-6b41f1af7981-host" (OuterVolumeSpecName: "host") pod "30742474-b02a-43d0-9356-6b41f1af7981" (UID: "30742474-b02a-43d0-9356-6b41f1af7981"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 14:01:23 crc kubenswrapper[4929]: I1002 14:01:23.846583 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztpzw\" (UniqueName: \"kubernetes.io/projected/30742474-b02a-43d0-9356-6b41f1af7981-kube-api-access-ztpzw\") pod \"30742474-b02a-43d0-9356-6b41f1af7981\" (UID: \"30742474-b02a-43d0-9356-6b41f1af7981\") "
Oct 02 14:01:23 crc kubenswrapper[4929]: I1002 14:01:23.847103 4929 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30742474-b02a-43d0-9356-6b41f1af7981-host\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:23 crc kubenswrapper[4929]: I1002 14:01:23.852460 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30742474-b02a-43d0-9356-6b41f1af7981-kube-api-access-ztpzw" (OuterVolumeSpecName: "kube-api-access-ztpzw") pod "30742474-b02a-43d0-9356-6b41f1af7981" (UID: "30742474-b02a-43d0-9356-6b41f1af7981"). InnerVolumeSpecName "kube-api-access-ztpzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 14:01:23 crc kubenswrapper[4929]: I1002 14:01:23.949253 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztpzw\" (UniqueName: \"kubernetes.io/projected/30742474-b02a-43d0-9356-6b41f1af7981-kube-api-access-ztpzw\") on node \"crc\" DevicePath \"\""
Oct 02 14:01:24 crc kubenswrapper[4929]: I1002 14:01:24.168931 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30742474-b02a-43d0-9356-6b41f1af7981" path="/var/lib/kubelet/pods/30742474-b02a-43d0-9356-6b41f1af7981/volumes"
Oct 02 14:01:24 crc kubenswrapper[4929]: I1002 14:01:24.605014 4929 scope.go:117] "RemoveContainer" containerID="f34da049658fc2ce845ce435109cd294af3dc73622eef67f0e551fe95345f6d2"
Oct 02 14:01:24 crc kubenswrapper[4929]: I1002 14:01:24.605102 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/crc-debug-6ms8r"
Oct 02 14:01:34 crc kubenswrapper[4929]: I1002 14:01:34.832692 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-w4lnt_7fc88030-43e9-4a64-a887-f8db65808659/kube-rbac-proxy/0.log"
Oct 02 14:01:34 crc kubenswrapper[4929]: I1002 14:01:34.991108 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-w4lnt_7fc88030-43e9-4a64-a887-f8db65808659/manager/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.138846 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-vbcph_e640cd16-0a71-4f55-a6bd-154d16e32427/kube-rbac-proxy/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.231015 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-jvvpf_dbbebfcb-a8c5-411f-a280-9d225411602e/kube-rbac-proxy/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.295044 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-vbcph_e640cd16-0a71-4f55-a6bd-154d16e32427/manager/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.339991 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-jvvpf_dbbebfcb-a8c5-411f-a280-9d225411602e/manager/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.424905 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms_a7303a1a-93cc-40f2-b087-f5b6723797f8/util/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.660657 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms_a7303a1a-93cc-40f2-b087-f5b6723797f8/pull/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.674554 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms_a7303a1a-93cc-40f2-b087-f5b6723797f8/pull/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.693394 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms_a7303a1a-93cc-40f2-b087-f5b6723797f8/util/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.850749 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms_a7303a1a-93cc-40f2-b087-f5b6723797f8/util/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.858593 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms_a7303a1a-93cc-40f2-b087-f5b6723797f8/pull/0.log"
Oct 02 14:01:35 crc kubenswrapper[4929]: I1002 14:01:35.890271 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fb6e38bc10b7a1797889b9a1c02a1d46ffe8d48a8a791b5370851590f1j7dms_a7303a1a-93cc-40f2-b087-f5b6723797f8/extract/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.069054 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-9qljm_bc30a5a5-e775-4c1d-a644-6072c3b0eeea/kube-rbac-proxy/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.119947 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-lpvb9_31731ea3-445c-4949-826f-012b32eb8737/kube-rbac-proxy/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.190726 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-9qljm_bc30a5a5-e775-4c1d-a644-6072c3b0eeea/manager/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.327374 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-kg8nv_8bc7a024-b6bf-46df-a6db-a27e3de0316b/kube-rbac-proxy/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.334530 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-lpvb9_31731ea3-445c-4949-826f-012b32eb8737/manager/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.404233 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-kg8nv_8bc7a024-b6bf-46df-a6db-a27e3de0316b/manager/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.558000 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-s5528_6c83fc23-d663-403f-a7d2-2f2398b6d0a3/kube-rbac-proxy/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.782382 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-7q5cl_315d86d0-376d-45a1-8bb1-bdb533e0a3fd/kube-rbac-proxy/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.823476 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-7q5cl_315d86d0-376d-45a1-8bb1-bdb533e0a3fd/manager/0.log"
Oct 02 14:01:36 crc kubenswrapper[4929]: I1002 14:01:36.840814 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-s5528_6c83fc23-d663-403f-a7d2-2f2398b6d0a3/manager/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.023375 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-6rxh4_d7bca063-4f2b-4430-bdeb-c018ca23445b/kube-rbac-proxy/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.144947 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-6rxh4_d7bca063-4f2b-4430-bdeb-c018ca23445b/manager/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.227379 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-qktzg_0f2221a1-54f4-4c60-89b7-1d7759038d5c/kube-rbac-proxy/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.266614 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-qktzg_0f2221a1-54f4-4c60-89b7-1d7759038d5c/manager/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.362161 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-9kptb_be9c3cb6-dc25-447a-923f-9cfabd7b97f3/kube-rbac-proxy/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.451219 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-9kptb_be9c3cb6-dc25-447a-923f-9cfabd7b97f3/manager/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.475199 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-52g6g_66bdd8cf-6f68-48c3-af01-1785234e988f/kube-rbac-proxy/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.611721 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-52g6g_66bdd8cf-6f68-48c3-af01-1785234e988f/manager/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.670947 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-jwjmh_310bec17-d196-4dd4-926c-817b053f36cc/kube-rbac-proxy/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.865574 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-67mmn_c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0/kube-rbac-proxy/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.870226 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-jwjmh_310bec17-d196-4dd4-926c-817b053f36cc/manager/0.log"
Oct 02 14:01:37 crc kubenswrapper[4929]: I1002 14:01:37.931344 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-67mmn_c87bc3f1-fd85-4e4f-bca1-ac892f48f6f0/manager/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.124828 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-ww7r7_b1438f7d-c50e-42ed-b444-8a9d019a886b/kube-rbac-proxy/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.125260 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-ww7r7_b1438f7d-c50e-42ed-b444-8a9d019a886b/manager/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.275498 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7c445c66cc-x2c6p_177c2f46-2f18-440c-b119-ee413ae5c4ed/kube-rbac-proxy/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.415785 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5f5c85554c-9k8x2_79d4fb9c-c8e4-427a-b7f8-3889ed40703a/kube-rbac-proxy/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.665503 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5f5c85554c-9k8x2_79d4fb9c-c8e4-427a-b7f8-3889ed40703a/operator/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.690051 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dd9vd_2298fe8f-5cad-4612-b33b-6941a43c5a63/registry-server/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.799107 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-hk8xv_c87923ae-b391-4bc6-8463-16d2f1c4427b/kube-rbac-proxy/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.918389 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-gc9bk_f2a55e76-361d-4a08-88e9-4b7594fda990/kube-rbac-proxy/0.log"
Oct 02 14:01:38 crc kubenswrapper[4929]: I1002 14:01:38.993401 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-hk8xv_c87923ae-b391-4bc6-8463-16d2f1c4427b/manager/0.log"
Oct 02 14:01:39 crc kubenswrapper[4929]: I1002 14:01:39.040293 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-gc9bk_f2a55e76-361d-4a08-88e9-4b7594fda990/manager/0.log"
Oct 02 14:01:39 crc kubenswrapper[4929]: I1002 14:01:39.203633 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-84jbl_a8a1c59c-9589-4343-b7df-1052d533177e/operator/0.log"
Oct 02 14:01:39 crc kubenswrapper[4929]: I1002 14:01:39.415898 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-j2xw4_5c208ca0-21fb-4313-b865-7d3219f4f180/kube-rbac-proxy/0.log"
Oct 02 14:01:39 crc kubenswrapper[4929]: I1002 14:01:39.453147 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-j2xw4_5c208ca0-21fb-4313-b865-7d3219f4f180/manager/0.log"
Oct 02 14:01:39 crc kubenswrapper[4929]: I1002 14:01:39.503815 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-whrgt_cdadb186-5a8a-4a7c-9dba-9ed850642887/kube-rbac-proxy/0.log"
Oct 02 14:01:39 crc kubenswrapper[4929]: I1002 14:01:39.747665 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-j5mr4_06267520-c8b3-4436-8088-2c42fdd5c1d3/kube-rbac-proxy/0.log"
Oct 02 14:01:39 crc kubenswrapper[4929]: I1002 14:01:39.834632 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-j5mr4_06267520-c8b3-4436-8088-2c42fdd5c1d3/manager/0.log"
Oct 02 14:01:40 crc kubenswrapper[4929]: I1002 14:01:40.037473 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-9csrz_b2d4b0ec-0488-4157-b187-d63fbc4a932b/kube-rbac-proxy/0.log"
Oct 02 14:01:40 crc kubenswrapper[4929]: I1002 14:01:40.086216 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-whrgt_cdadb186-5a8a-4a7c-9dba-9ed850642887/manager/0.log"
Oct 02 14:01:40 crc kubenswrapper[4929]: I1002 14:01:40.095195 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-9csrz_b2d4b0ec-0488-4157-b187-d63fbc4a932b/manager/0.log"
Oct 02 14:01:40 crc kubenswrapper[4929]: I1002 14:01:40.856069 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7c445c66cc-x2c6p_177c2f46-2f18-440c-b119-ee413ae5c4ed/manager/0.log"
Oct 02 14:01:44 crc kubenswrapper[4929]: I1002 14:01:44.737210 4929 patch_prober.go:28] interesting pod/machine-config-daemon-8j488 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 14:01:44 crc kubenswrapper[4929]: I1002 14:01:44.737726 4929 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 14:01:44 crc kubenswrapper[4929]: I1002 14:01:44.737776 4929 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8j488"
Oct 02 14:01:44 crc kubenswrapper[4929]: I1002 14:01:44.738670 4929 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"} pod="openshift-machine-config-operator/machine-config-daemon-8j488" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 14:01:44 crc kubenswrapper[4929]: I1002 14:01:44.738736 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" containerName="machine-config-daemon" containerID="cri-o://e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" gracePeriod=600
Oct 02 14:01:44 crc kubenswrapper[4929]: E1002 14:01:44.874382 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:01:45 crc kubenswrapper[4929]: I1002 14:01:45.822820 4929 generic.go:334] "Generic (PLEG): container finished" podID="1b4b5329-0385-4f39-9d63-70284421e448" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" exitCode=0
Oct 02 14:01:45 crc kubenswrapper[4929]: I1002 14:01:45.822908 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerDied","Data":"e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"}
Oct 02 14:01:45 crc kubenswrapper[4929]: I1002 14:01:45.823274 4929 scope.go:117] "RemoveContainer" containerID="7f5e49c3f46a77906712aad83f9348a7f979bacb735fe35ef412f45fbde15d2d"
Oct 02 14:01:45 crc kubenswrapper[4929]: I1002 14:01:45.824094 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:01:45 crc kubenswrapper[4929]: E1002 14:01:45.824482 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:01:56 crc kubenswrapper[4929]: I1002 14:01:56.040127 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rch4g_c43f5d6e-be64-4291-bd66-2548210bc566/control-plane-machine-set-operator/0.log"
Oct 02 14:01:56 crc kubenswrapper[4929]: I1002 14:01:56.107907 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6xvv6_aa6fb363-0b33-4217-82df-61e525432168/kube-rbac-proxy/0.log"
Oct 02 14:01:56 crc kubenswrapper[4929]: I1002 14:01:56.196025 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-6xvv6_aa6fb363-0b33-4217-82df-61e525432168/machine-api-operator/0.log"
Oct 02 14:01:59 crc kubenswrapper[4929]: I1002 14:01:59.158050 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:01:59 crc kubenswrapper[4929]: E1002 14:01:59.158893 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:02:07 crc kubenswrapper[4929]: I1002 14:02:07.492339 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-g5cwl_392aa533-50c6-4fa9-993d-22ff65870e4a/cert-manager-controller/0.log"
Oct 02 14:02:07 crc kubenswrapper[4929]: I1002 14:02:07.647845 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-2746m_ca8b679f-63d7-48c7-bd1b-a37148b857f7/cert-manager-webhook/0.log"
Oct 02 14:02:07 crc kubenswrapper[4929]: I1002 14:02:07.705043 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-8xb2l_09157fd6-e304-4dcc-82c3-724f2ff5d23c/cert-manager-cainjector/0.log"
Oct 02 14:02:13 crc kubenswrapper[4929]: I1002 14:02:13.157679 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:02:13 crc kubenswrapper[4929]: E1002 14:02:13.158935 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:02:18 crc kubenswrapper[4929]: I1002 14:02:18.345346 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-n52n5_fcde8b6a-6d5a-47a8-8d66-78bfc8934410/nmstate-console-plugin/0.log"
Oct 02 14:02:18 crc kubenswrapper[4929]: I1002 14:02:18.516279 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-xvd9j_d1617fbb-7eac-484e-bf5f-ec3083165f47/kube-rbac-proxy/0.log"
Oct 02 14:02:18 crc kubenswrapper[4929]: I1002 14:02:18.540695 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hmcq8_a6096daf-c096-473d-b53c-07af5d99a7ee/nmstate-handler/0.log"
Oct 02 14:02:18 crc kubenswrapper[4929]: I1002 14:02:18.593632 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-xvd9j_d1617fbb-7eac-484e-bf5f-ec3083165f47/nmstate-metrics/0.log"
Oct 02 14:02:18 crc kubenswrapper[4929]: I1002 14:02:18.739707 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-g6tqv_7364e0e7-ec72-4345-bd2d-09eaf28a4db4/nmstate-operator/0.log"
Oct 02 14:02:18 crc kubenswrapper[4929]: I1002 14:02:18.795087 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-q7xjr_45290c56-c1df-4f53-a3ed-796fd6624bab/nmstate-webhook/0.log"
Oct 02 14:02:25 crc kubenswrapper[4929]: I1002 14:02:25.156404 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:02:25 crc kubenswrapper[4929]: E1002 14:02:25.157349 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:02:31 crc kubenswrapper[4929]: I1002 14:02:31.757278 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rbghc_31f871c6-7f36-48ee-b3d0-bc4913874419/kube-rbac-proxy/0.log"
Oct 02 14:02:31 crc kubenswrapper[4929]: I1002 14:02:31.932672 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-frr-files/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.231827 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rbghc_31f871c6-7f36-48ee-b3d0-bc4913874419/controller/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.314460 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-frr-files/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.384728 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-metrics/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.399380 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-reloader/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.464538 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-reloader/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.677570 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-reloader/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.721818 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-metrics/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.761882 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-frr-files/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.785874 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-metrics/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.997564 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-reloader/0.log"
Oct 02 14:02:32 crc kubenswrapper[4929]: I1002 14:02:32.998201 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-frr-files/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.009114 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/cp-metrics/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.083818 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/controller/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.186179 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/frr-metrics/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.238625 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/kube-rbac-proxy/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.298261 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/kube-rbac-proxy-frr/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.392300 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/reloader/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.527494 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-9l74k_81d9a392-bd1f-4577-a81f-df9fa22dbdec/frr-k8s-webhook-server/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.686346 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-77d94b9d4d-669p4_a5e4ea95-81a8-42c4-aa5a-53a414c78b13/manager/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.852749 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-685ffc5d48-q7qgf_5bc6808d-8e32-418c-b479-36e879e768d1/webhook-server/0.log"
Oct 02 14:02:33 crc kubenswrapper[4929]: I1002 14:02:33.993351 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2j7bg_501c2d93-5384-46e1-a709-881d7e6eb442/kube-rbac-proxy/0.log"
Oct 02 14:02:35 crc kubenswrapper[4929]: I1002 14:02:35.216016 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2j7bg_501c2d93-5384-46e1-a709-881d7e6eb442/speaker/0.log"
Oct 02 14:02:36 crc kubenswrapper[4929]: I1002 14:02:36.156882 4929 scope.go:117] "RemoveContainer"
containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:02:36 crc kubenswrapper[4929]: E1002 14:02:36.158034 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 14:02:36 crc kubenswrapper[4929]: I1002 14:02:36.895828 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mnbch_f8f57c23-45b1-47b0-bc42-33dc9b2f1f53/frr/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.229558 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4_c650f07c-274b-4670-b136-d49448f2a3e4/util/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.375309 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4_c650f07c-274b-4670-b136-d49448f2a3e4/util/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.383476 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4_c650f07c-274b-4670-b136-d49448f2a3e4/pull/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.416504 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4_c650f07c-274b-4670-b136-d49448f2a3e4/pull/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.617918 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4_c650f07c-274b-4670-b136-d49448f2a3e4/util/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.650335 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4_c650f07c-274b-4670-b136-d49448f2a3e4/pull/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.669988 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69c9sc4_c650f07c-274b-4670-b136-d49448f2a3e4/extract/0.log" Oct 02 14:02:47 crc kubenswrapper[4929]: I1002 14:02:47.826909 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm_9c86c553-24cc-44e5-ae1d-0b91e5e44c88/util/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.014937 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm_9c86c553-24cc-44e5-ae1d-0b91e5e44c88/pull/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.022521 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm_9c86c553-24cc-44e5-ae1d-0b91e5e44c88/util/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.025296 4929 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm_9c86c553-24cc-44e5-ae1d-0b91e5e44c88/pull/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.222338 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm_9c86c553-24cc-44e5-ae1d-0b91e5e44c88/util/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.239727 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm_9c86c553-24cc-44e5-ae1d-0b91e5e44c88/extract/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.254229 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d29rffm_9c86c553-24cc-44e5-ae1d-0b91e5e44c88/pull/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.407082 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k_5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255/util/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.652809 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k_5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255/pull/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.661879 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k_5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255/util/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.665975 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k_5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255/pull/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.828238 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k_5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255/pull/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.852592 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k_5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255/util/0.log" Oct 02 14:02:48 crc kubenswrapper[4929]: I1002 14:02:48.874469 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dcjw4k_5f3cd1b2-84a0-4f20-bdfc-f4b56d95f255/extract/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.005905 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxvtg_6abdb15b-9bdf-4ba4-aaf0-899b30be9d78/extract-utilities/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.156822 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:02:49 crc kubenswrapper[4929]: E1002 14:02:49.157531 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.167129 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxvtg_6abdb15b-9bdf-4ba4-aaf0-899b30be9d78/extract-utilities/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.175726 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxvtg_6abdb15b-9bdf-4ba4-aaf0-899b30be9d78/extract-content/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.210899 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxvtg_6abdb15b-9bdf-4ba4-aaf0-899b30be9d78/extract-content/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.390586 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxvtg_6abdb15b-9bdf-4ba4-aaf0-899b30be9d78/extract-utilities/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.396035 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxvtg_6abdb15b-9bdf-4ba4-aaf0-899b30be9d78/extract-content/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.633820 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r2mws_8107f56e-8ec5-4eee-a71a-49d929d35a2d/extract-utilities/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.870838 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r2mws_8107f56e-8ec5-4eee-a71a-49d929d35a2d/extract-utilities/0.log" Oct 02 14:02:49 crc kubenswrapper[4929]: I1002 14:02:49.994910 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r2mws_8107f56e-8ec5-4eee-a71a-49d929d35a2d/extract-content/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.078341 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r2mws_8107f56e-8ec5-4eee-a71a-49d929d35a2d/extract-content/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.164839 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r2mws_8107f56e-8ec5-4eee-a71a-49d929d35a2d/extract-utilities/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.197562 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r2mws_8107f56e-8ec5-4eee-a71a-49d929d35a2d/extract-content/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.263556 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fxvtg_6abdb15b-9bdf-4ba4-aaf0-899b30be9d78/registry-server/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.485175 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v_0986f61a-c4cc-426b-861c-343816225e99/util/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.703055 4929 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v_0986f61a-c4cc-426b-861c-343816225e99/pull/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.746617 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v_0986f61a-c4cc-426b-861c-343816225e99/util/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.760811 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v_0986f61a-c4cc-426b-861c-343816225e99/pull/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.938231 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v_0986f61a-c4cc-426b-861c-343816225e99/util/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.992007 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v_0986f61a-c4cc-426b-861c-343816225e99/pull/0.log" Oct 02 14:02:50 crc kubenswrapper[4929]: I1002 14:02:50.995104 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cscs5v_0986f61a-c4cc-426b-861c-343816225e99/extract/0.log" Oct 02 14:02:51 crc kubenswrapper[4929]: I1002 14:02:51.236432 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvtt6_01d6460b-931e-456e-ae3d-8b9216249c60/extract-utilities/0.log" Oct 02 14:02:51 crc kubenswrapper[4929]: I1002 14:02:51.279804 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-c4q96_9d676ad8-1218-41d5-b194-aa15bf42d384/marketplace-operator/0.log" Oct 02 14:02:51 crc kubenswrapper[4929]: I1002 14:02:51.643125 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvtt6_01d6460b-931e-456e-ae3d-8b9216249c60/extract-content/0.log" Oct 02 14:02:51 crc kubenswrapper[4929]: I1002 14:02:51.769785 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvtt6_01d6460b-931e-456e-ae3d-8b9216249c60/extract-content/0.log" Oct 02 14:02:51 crc kubenswrapper[4929]: I1002 14:02:51.769848 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvtt6_01d6460b-931e-456e-ae3d-8b9216249c60/extract-utilities/0.log" Oct 02 14:02:51 crc kubenswrapper[4929]: I1002 14:02:51.913195 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvtt6_01d6460b-931e-456e-ae3d-8b9216249c60/extract-content/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.076780 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvtt6_01d6460b-931e-456e-ae3d-8b9216249c60/extract-utilities/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.101715 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-r2mws_8107f56e-8ec5-4eee-a71a-49d929d35a2d/registry-server/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.161690 4929 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-7dlvd_50083b96-5603-412c-b283-430e43790b81/extract-utilities/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.466006 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7dlvd_50083b96-5603-412c-b283-430e43790b81/extract-content/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.541483 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7dlvd_50083b96-5603-412c-b283-430e43790b81/extract-content/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.554759 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7dlvd_50083b96-5603-412c-b283-430e43790b81/extract-utilities/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.578335 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvtt6_01d6460b-931e-456e-ae3d-8b9216249c60/registry-server/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.726323 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7dlvd_50083b96-5603-412c-b283-430e43790b81/extract-utilities/0.log" Oct 02 14:02:52 crc kubenswrapper[4929]: I1002 14:02:52.779244 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7dlvd_50083b96-5603-412c-b283-430e43790b81/extract-content/0.log" Oct 02 14:02:53 crc kubenswrapper[4929]: I1002 14:02:53.177017 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7dlvd_50083b96-5603-412c-b283-430e43790b81/registry-server/0.log" Oct 02 14:03:02 crc kubenswrapper[4929]: I1002 14:03:02.156552 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:03:02 crc kubenswrapper[4929]: E1002 14:03:02.157458 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 14:03:04 crc kubenswrapper[4929]: I1002 14:03:04.016565 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-qrjq5_efb8bd58-0f3b-44ec-900a-12e77329a35d/prometheus-operator/0.log" Oct 02 14:03:04 crc kubenswrapper[4929]: I1002 14:03:04.201753 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694946c4b4-lqt5c_068f6f40-2d52-4600-a441-df5ca2543dad/prometheus-operator-admission-webhook/0.log" Oct 02 14:03:04 crc kubenswrapper[4929]: I1002 14:03:04.251392 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694946c4b4-pvwr8_37fba1c2-0ff5-44e6-8192-14c4ba4c4e22/prometheus-operator-admission-webhook/0.log" Oct 02 14:03:04 crc kubenswrapper[4929]: I1002 14:03:04.379116 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-fmst2_3f310c2b-2d4a-4538-a08e-d87f3da76b2f/operator/0.log" Oct 02 14:03:04 crc kubenswrapper[4929]: I1002 
14:03:04.456602 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-bqlh9_04cddf71-5b78-47a3-a3a9-a7e65d8fe1aa/perses-operator/0.log" Oct 02 14:03:16 crc kubenswrapper[4929]: I1002 14:03:16.160076 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:03:16 crc kubenswrapper[4929]: E1002 14:03:16.161254 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 14:03:28 crc kubenswrapper[4929]: I1002 14:03:28.157811 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:03:28 crc kubenswrapper[4929]: E1002 14:03:28.158606 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 14:03:42 crc kubenswrapper[4929]: I1002 14:03:42.176282 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:03:42 crc kubenswrapper[4929]: E1002 14:03:42.176969 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 14:03:55 crc kubenswrapper[4929]: I1002 14:03:55.158150 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:03:55 crc kubenswrapper[4929]: E1002 14:03:55.159115 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" Oct 02 14:04:09 crc kubenswrapper[4929]: I1002 14:04:09.157565 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17" Oct 02 14:04:09 crc kubenswrapper[4929]: E1002 14:04:09.158592 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448" 
Oct 02 14:04:23 crc kubenswrapper[4929]: I1002 14:04:23.157646 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:04:23 crc kubenswrapper[4929]: E1002 14:04:23.158465 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:04:35 crc kubenswrapper[4929]: I1002 14:04:35.156583 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:04:35 crc kubenswrapper[4929]: E1002 14:04:35.157270 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:04:46 crc kubenswrapper[4929]: I1002 14:04:46.157779 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:04:46 crc kubenswrapper[4929]: E1002 14:04:46.158877 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:04:58 crc kubenswrapper[4929]: I1002 14:04:58.157516 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:04:58 crc kubenswrapper[4929]: E1002 14:04:58.158195 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:05:12 crc kubenswrapper[4929]: I1002 14:05:12.160578 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:05:12 crc kubenswrapper[4929]: E1002 14:05:12.161397 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:05:24 crc kubenswrapper[4929]: I1002 14:05:24.157785 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:05:24 crc kubenswrapper[4929]: E1002 14:05:24.159423 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:05:39 crc kubenswrapper[4929]: I1002 14:05:39.156758 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:05:39 crc kubenswrapper[4929]: E1002 14:05:39.157504 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:05:52 crc kubenswrapper[4929]: I1002 14:05:52.162279 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:05:52 crc kubenswrapper[4929]: E1002 14:05:52.163012 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:05:58 crc kubenswrapper[4929]: I1002 14:05:58.563091 4929 generic.go:334] "Generic (PLEG): container finished" podID="ff029149-4944-4650-9351-69aa7531e9a6" containerID="5b2a31ae770dc6273b7f8a17f720ab66d7233f59d21285c19adce21563bc8931" exitCode=0
Oct 02 14:05:58 crc kubenswrapper[4929]: I1002 14:05:58.563187 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" event={"ID":"ff029149-4944-4650-9351-69aa7531e9a6","Type":"ContainerDied","Data":"5b2a31ae770dc6273b7f8a17f720ab66d7233f59d21285c19adce21563bc8931"}
Oct 02 14:05:58 crc kubenswrapper[4929]: I1002 14:05:58.564364 4929 scope.go:117] "RemoveContainer" containerID="5b2a31ae770dc6273b7f8a17f720ab66d7233f59d21285c19adce21563bc8931"
Oct 02 14:05:58 crc kubenswrapper[4929]: I1002 14:05:58.643402 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6j7s_must-gather-wwcqc_ff029149-4944-4650-9351-69aa7531e9a6/gather/0.log"
Oct 02 14:06:06 crc kubenswrapper[4929]: I1002 14:06:06.157184 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:06:06 crc kubenswrapper[4929]: E1002 14:06:06.158751 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:06:07 crc kubenswrapper[4929]: I1002 14:06:07.510299 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6j7s/must-gather-wwcqc"]
Oct 02 14:06:07 crc kubenswrapper[4929]: I1002 14:06:07.510852 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t6j7s/must-gather-wwcqc" podUID="ff029149-4944-4650-9351-69aa7531e9a6" containerName="copy" containerID="cri-o://88aa457f10c0f60757fe1ae770b191cb525348b7d9c6cb54008e5fd47787175c" gracePeriod=2
Oct 02 14:06:07 crc kubenswrapper[4929]: I1002 14:06:07.521715 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6j7s/must-gather-wwcqc"]
Oct 02 14:06:07 crc kubenswrapper[4929]: I1002 14:06:07.660001 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6j7s_must-gather-wwcqc_ff029149-4944-4650-9351-69aa7531e9a6/copy/0.log"
Oct 02 14:06:07 crc kubenswrapper[4929]: I1002 14:06:07.660593 4929 generic.go:334] "Generic (PLEG): container finished" podID="ff029149-4944-4650-9351-69aa7531e9a6" containerID="88aa457f10c0f60757fe1ae770b191cb525348b7d9c6cb54008e5fd47787175c" exitCode=143
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.349836 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6j7s_must-gather-wwcqc_ff029149-4944-4650-9351-69aa7531e9a6/copy/0.log"
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.350734 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/must-gather-wwcqc"
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.487581 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff029149-4944-4650-9351-69aa7531e9a6-must-gather-output\") pod \"ff029149-4944-4650-9351-69aa7531e9a6\" (UID: \"ff029149-4944-4650-9351-69aa7531e9a6\") "
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.487979 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcbp5\" (UniqueName: \"kubernetes.io/projected/ff029149-4944-4650-9351-69aa7531e9a6-kube-api-access-xcbp5\") pod \"ff029149-4944-4650-9351-69aa7531e9a6\" (UID: \"ff029149-4944-4650-9351-69aa7531e9a6\") "
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.497056 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff029149-4944-4650-9351-69aa7531e9a6-kube-api-access-xcbp5" (OuterVolumeSpecName: "kube-api-access-xcbp5") pod "ff029149-4944-4650-9351-69aa7531e9a6" (UID: "ff029149-4944-4650-9351-69aa7531e9a6"). InnerVolumeSpecName "kube-api-access-xcbp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.591397 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcbp5\" (UniqueName: \"kubernetes.io/projected/ff029149-4944-4650-9351-69aa7531e9a6-kube-api-access-xcbp5\") on node \"crc\" DevicePath \"\""
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.682172 4929 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6j7s_must-gather-wwcqc_ff029149-4944-4650-9351-69aa7531e9a6/copy/0.log"
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.682613 4929 scope.go:117] "RemoveContainer" containerID="88aa457f10c0f60757fe1ae770b191cb525348b7d9c6cb54008e5fd47787175c"
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.682755 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6j7s/must-gather-wwcqc"
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.724046 4929 scope.go:117] "RemoveContainer" containerID="5b2a31ae770dc6273b7f8a17f720ab66d7233f59d21285c19adce21563bc8931"
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.786478 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff029149-4944-4650-9351-69aa7531e9a6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ff029149-4944-4650-9351-69aa7531e9a6" (UID: "ff029149-4944-4650-9351-69aa7531e9a6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 14:06:08 crc kubenswrapper[4929]: I1002 14:06:08.798160 4929 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff029149-4944-4650-9351-69aa7531e9a6-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 02 14:06:10 crc kubenswrapper[4929]: I1002 14:06:10.172108 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff029149-4944-4650-9351-69aa7531e9a6" path="/var/lib/kubelet/pods/ff029149-4944-4650-9351-69aa7531e9a6/volumes"
Oct 02 14:06:20 crc kubenswrapper[4929]: I1002 14:06:20.163480 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:06:20 crc kubenswrapper[4929]: E1002 14:06:20.164223 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.251003 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cks54"]
Oct 02 14:06:29 crc kubenswrapper[4929]: E1002 14:06:29.252081 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff029149-4944-4650-9351-69aa7531e9a6" containerName="gather"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.252096 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff029149-4944-4650-9351-69aa7531e9a6" containerName="gather"
Oct 02 14:06:29 crc kubenswrapper[4929]: E1002 14:06:29.252135 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30742474-b02a-43d0-9356-6b41f1af7981" containerName="container-00"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.252143 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="30742474-b02a-43d0-9356-6b41f1af7981" containerName="container-00"
Oct 02 14:06:29 crc kubenswrapper[4929]: E1002 14:06:29.252175 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff029149-4944-4650-9351-69aa7531e9a6" containerName="copy"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.252182 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff029149-4944-4650-9351-69aa7531e9a6" containerName="copy"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.252366 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="30742474-b02a-43d0-9356-6b41f1af7981" containerName="container-00"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.252381 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff029149-4944-4650-9351-69aa7531e9a6" containerName="copy"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.252398 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff029149-4944-4650-9351-69aa7531e9a6" containerName="gather"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.254209 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.292906 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cks54"]
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.387584 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-catalog-content\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.387710 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxxl9\" (UniqueName: \"kubernetes.io/projected/f49fa667-36a0-4300-b78b-f6a92b499dec-kube-api-access-dxxl9\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.387846 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-utilities\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.489978 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-catalog-content\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.490063 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxxl9\" (UniqueName: \"kubernetes.io/projected/f49fa667-36a0-4300-b78b-f6a92b499dec-kube-api-access-dxxl9\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.490191 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-utilities\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.490509 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-catalog-content\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.490605 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-utilities\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.514972 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxxl9\" (UniqueName: \"kubernetes.io/projected/f49fa667-36a0-4300-b78b-f6a92b499dec-kube-api-access-dxxl9\") pod \"redhat-operators-cks54\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") " pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:29 crc kubenswrapper[4929]: I1002 14:06:29.578427 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:30 crc kubenswrapper[4929]: I1002 14:06:30.043384 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cks54"]
Oct 02 14:06:30 crc kubenswrapper[4929]: I1002 14:06:30.971543 4929 generic.go:334] "Generic (PLEG): container finished" podID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerID="167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb" exitCode=0
Oct 02 14:06:30 crc kubenswrapper[4929]: I1002 14:06:30.971716 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cks54" event={"ID":"f49fa667-36a0-4300-b78b-f6a92b499dec","Type":"ContainerDied","Data":"167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb"}
Oct 02 14:06:30 crc kubenswrapper[4929]: I1002 14:06:30.971891 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cks54" event={"ID":"f49fa667-36a0-4300-b78b-f6a92b499dec","Type":"ContainerStarted","Data":"a4dff80c97585df55c8e7c106890770120281beb4653e889a88aae7533ce3135"}
Oct 02 14:06:30 crc kubenswrapper[4929]: I1002 14:06:30.973894 4929 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 14:06:31 crc kubenswrapper[4929]: I1002 14:06:31.156877 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:06:31 crc kubenswrapper[4929]: E1002 14:06:31.157401 4929 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8j488_openshift-machine-config-operator(1b4b5329-0385-4f39-9d63-70284421e448)\"" pod="openshift-machine-config-operator/machine-config-daemon-8j488" podUID="1b4b5329-0385-4f39-9d63-70284421e448"
Oct 02 14:06:32 crc kubenswrapper[4929]: I1002 14:06:32.993272 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cks54" event={"ID":"f49fa667-36a0-4300-b78b-f6a92b499dec","Type":"ContainerStarted","Data":"1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8"}
Oct 02 14:06:35 crc kubenswrapper[4929]: I1002 14:06:35.015182 4929 generic.go:334] "Generic (PLEG): container finished" podID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerID="1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8" exitCode=0
Oct 02 14:06:35 crc kubenswrapper[4929]: I1002 14:06:35.015268 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cks54" event={"ID":"f49fa667-36a0-4300-b78b-f6a92b499dec","Type":"ContainerDied","Data":"1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8"}
Oct 02 14:06:36 crc kubenswrapper[4929]: I1002 14:06:36.033396 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cks54" event={"ID":"f49fa667-36a0-4300-b78b-f6a92b499dec","Type":"ContainerStarted","Data":"0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3"}
Oct 02 14:06:36 crc kubenswrapper[4929]: I1002 14:06:36.072633 4929 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cks54" podStartSLOduration=2.380802529 podStartE2EDuration="7.072612685s" podCreationTimestamp="2025-10-02 14:06:29 +0000 UTC" firstStartedPulling="2025-10-02 14:06:30.973672184 +0000 UTC m=+10591.524038548" lastFinishedPulling="2025-10-02 14:06:35.66548234 +0000 UTC m=+10596.215848704" observedRunningTime="2025-10-02 14:06:36.06829133 +0000 UTC m=+10596.618657734" watchObservedRunningTime="2025-10-02 14:06:36.072612685 +0000 UTC m=+10596.622979049"
Oct 02 14:06:39 crc kubenswrapper[4929]: I1002 14:06:39.579341 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:39 crc kubenswrapper[4929]: I1002 14:06:39.579632 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:40 crc kubenswrapper[4929]: I1002 14:06:40.631363 4929 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cks54" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="registry-server" probeResult="failure" output=<
Oct 02 14:06:40 crc kubenswrapper[4929]: timeout: failed to connect service ":50051" within 1s
Oct 02 14:06:40 crc kubenswrapper[4929]: >
Oct 02 14:06:45 crc kubenswrapper[4929]: I1002 14:06:45.156692 4929 scope.go:117] "RemoveContainer" containerID="e4b9cb08f34cfa004aa97f9ef945b558e36c762686212cfc1d185a619128fa17"
Oct 02 14:06:46 crc kubenswrapper[4929]: I1002 14:06:46.137777 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8j488" event={"ID":"1b4b5329-0385-4f39-9d63-70284421e448","Type":"ContainerStarted","Data":"3e7e8357f6bd04ab59462e225760611342bebb073916f27e7bf3339766a33bda"}
Oct 02 14:06:49 crc kubenswrapper[4929]: I1002 14:06:49.635044 4929 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:49 crc kubenswrapper[4929]: I1002 14:06:49.692571 4929 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:49 crc kubenswrapper[4929]: I1002 14:06:49.875869 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cks54"]
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.212361 4929 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cks54" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="registry-server" containerID="cri-o://0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3" gracePeriod=2
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.716262 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.869712 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-catalog-content\") pod \"f49fa667-36a0-4300-b78b-f6a92b499dec\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") "
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.869841 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxxl9\" (UniqueName: \"kubernetes.io/projected/f49fa667-36a0-4300-b78b-f6a92b499dec-kube-api-access-dxxl9\") pod \"f49fa667-36a0-4300-b78b-f6a92b499dec\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") "
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.869909 4929 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-utilities\") pod \"f49fa667-36a0-4300-b78b-f6a92b499dec\" (UID: \"f49fa667-36a0-4300-b78b-f6a92b499dec\") "
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.872403 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-utilities" (OuterVolumeSpecName: "utilities") pod "f49fa667-36a0-4300-b78b-f6a92b499dec" (UID: "f49fa667-36a0-4300-b78b-f6a92b499dec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.882541 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49fa667-36a0-4300-b78b-f6a92b499dec-kube-api-access-dxxl9" (OuterVolumeSpecName: "kube-api-access-dxxl9") pod "f49fa667-36a0-4300-b78b-f6a92b499dec" (UID: "f49fa667-36a0-4300-b78b-f6a92b499dec"). InnerVolumeSpecName "kube-api-access-dxxl9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.972518 4929 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxxl9\" (UniqueName: \"kubernetes.io/projected/f49fa667-36a0-4300-b78b-f6a92b499dec-kube-api-access-dxxl9\") on node \"crc\" DevicePath \"\""
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.972556 4929 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 14:06:51 crc kubenswrapper[4929]: I1002 14:06:51.981195 4929 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f49fa667-36a0-4300-b78b-f6a92b499dec" (UID: "f49fa667-36a0-4300-b78b-f6a92b499dec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.074380 4929 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49fa667-36a0-4300-b78b-f6a92b499dec-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.227085 4929 generic.go:334] "Generic (PLEG): container finished" podID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerID="0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3" exitCode=0
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.227175 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cks54" event={"ID":"f49fa667-36a0-4300-b78b-f6a92b499dec","Type":"ContainerDied","Data":"0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3"}
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.227544 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cks54" event={"ID":"f49fa667-36a0-4300-b78b-f6a92b499dec","Type":"ContainerDied","Data":"a4dff80c97585df55c8e7c106890770120281beb4653e889a88aae7533ce3135"}
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.227573 4929 scope.go:117] "RemoveContainer" containerID="0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.227192 4929 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cks54"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.258657 4929 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cks54"]
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.262149 4929 scope.go:117] "RemoveContainer" containerID="1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.270922 4929 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cks54"]
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.295339 4929 scope.go:117] "RemoveContainer" containerID="167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.345372 4929 scope.go:117] "RemoveContainer" containerID="0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3"
Oct 02 14:06:52 crc kubenswrapper[4929]: E1002 14:06:52.345912 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3\": container with ID starting with 0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3 not found: ID does not exist" containerID="0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.346048 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3"} err="failed to get container status \"0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3\": rpc error: code = NotFound desc = could not find container \"0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3\": container with ID starting with 0f2ee7ac4a56f7c941c2213852124b2f048f04fbc0c0bf3118666f0b668d65e3 not found: ID does not exist"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.346079 4929 scope.go:117] "RemoveContainer" containerID="1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8"
Oct 02 14:06:52 crc kubenswrapper[4929]: E1002 14:06:52.346468 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8\": container with ID starting with 1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8 not found: ID does not exist" containerID="1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.346518 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8"} err="failed to get container status \"1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8\": rpc error: code = NotFound desc = could not find container \"1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8\": container with ID starting with 1b38ac17f6d56425ae0283613b98614be137bab74fb402797e5b605c728e58d8 not found: ID does not exist"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.346574 4929 scope.go:117] "RemoveContainer" containerID="167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb"
Oct 02 14:06:52 crc kubenswrapper[4929]: E1002 14:06:52.346927 4929 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb\": container with ID starting with 167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb not found: ID does not exist" containerID="167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb"
Oct 02 14:06:52 crc kubenswrapper[4929]: I1002 14:06:52.346949 4929 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb"} err="failed to get container status \"167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb\": rpc error: code = NotFound desc = could not find container \"167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb\": container with ID starting with 167f4f3126585fa5879b64955b2105bef638c19b7e31db3f0f1dcc0fa49e9dcb not found: ID does not exist"
Oct 02 14:06:54 crc kubenswrapper[4929]: I1002 14:06:54.171150 4929 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" path="/var/lib/kubelet/pods/f49fa667-36a0-4300-b78b-f6a92b499dec/volumes"
Oct 02 14:07:30 crc kubenswrapper[4929]: I1002 14:07:30.196217 4929 scope.go:117] "RemoveContainer" containerID="671acdb51100e6246af929d3fd7cb31df8499249a0989a0a4ba28bf8e60488bb"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.047280 4929 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vntzj"]
Oct 02 14:08:53 crc kubenswrapper[4929]: E1002 14:08:53.048444 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="extract-utilities"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.048458 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="extract-utilities"
Oct 02 14:08:53 crc kubenswrapper[4929]: E1002 14:08:53.048474 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="extract-content"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.048482 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="extract-content"
Oct 02 14:08:53 crc kubenswrapper[4929]: E1002 14:08:53.048509 4929 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="registry-server"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.048517 4929 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="registry-server"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.048888 4929 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49fa667-36a0-4300-b78b-f6a92b499dec" containerName="registry-server"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.051168 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vntzj"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.058864 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vntzj"]
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.089675 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-catalog-content\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.089847 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-utilities\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.089899 4929 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7qz\" (UniqueName: \"kubernetes.io/projected/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-kube-api-access-hk7qz\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.192466 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-catalog-content\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.192537 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-utilities\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj"
Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.192560 4929 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7qz\" (UniqueName:
\"kubernetes.io/projected/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-kube-api-access-hk7qz\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj" Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.193347 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-catalog-content\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj" Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.193564 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-utilities\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj" Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.228830 4929 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7qz\" (UniqueName: \"kubernetes.io/projected/09b1f89e-7dc4-47a1-b0cf-418db0efb17b-kube-api-access-hk7qz\") pod \"community-operators-vntzj\" (UID: \"09b1f89e-7dc4-47a1-b0cf-418db0efb17b\") " pod="openshift-marketplace/community-operators-vntzj" Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.379957 4929 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vntzj" Oct 02 14:08:53 crc kubenswrapper[4929]: I1002 14:08:53.919278 4929 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vntzj"] Oct 02 14:08:54 crc kubenswrapper[4929]: I1002 14:08:54.530297 4929 generic.go:334] "Generic (PLEG): container finished" podID="09b1f89e-7dc4-47a1-b0cf-418db0efb17b" containerID="ad7c0c5643106398ec51d57770227ad25684e7dc23d5c2b599a9f824aef99d65" exitCode=0 Oct 02 14:08:54 crc kubenswrapper[4929]: I1002 14:08:54.530410 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vntzj" event={"ID":"09b1f89e-7dc4-47a1-b0cf-418db0efb17b","Type":"ContainerDied","Data":"ad7c0c5643106398ec51d57770227ad25684e7dc23d5c2b599a9f824aef99d65"} Oct 02 14:08:54 crc kubenswrapper[4929]: I1002 14:08:54.530589 4929 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vntzj" event={"ID":"09b1f89e-7dc4-47a1-b0cf-418db0efb17b","Type":"ContainerStarted","Data":"265cc5a47d7f4c27e4abfe475ecfb61a39f9c79fe3faf94a8c9b07e17a466495"}